  • Thank you for this. I've commented the code and also posted a link to the PDF of the big program I'm working on. The user input can be in any format. Right now, if you look at the code, it's in ASCII in the display area (e.g., '1' = $B1) and converted to hex numbers in the memory area (e.g., '1' = $01), which I created for mathematical manipulation. It turns out I should use $C1 for '1'. Converting the input to the correct format isn't a problem; I just need to know how to get that data into the proper subroutines and then extract the result in a format I can print in the "calculator display." Commented Apr 4, 2021 at 14:09
  • ASCII is a 7-bit code, but it appears the Apple expects the high bit set. '1' in ASCII is 0x31, or 0xB1 if you set the high bit (a small sketch of this conversion follows the thread). A terminology note, though: in memory it's just bits; it's not inherently hex or decimal or anything. That only depends on how you look at it. Commented Apr 4, 2021 at 15:26
  • @another-dave Yes and no. It's true that the Apple uses a set high bit on key codes, as this signals a fresh keystroke, and yes, the encoding in video RAM is even more mangled; but no, in BASIC the high bit is cleared. This is true for any data, as well as for tokenized BASIC. After all, it's a rather plain MS-BASIC, with tokens occupying the range above $7F. Commented Apr 4, 2021 at 15:56
  • I can convert the user input (e.g., "123" to $31 $32 $33) for use with FRMNUM. That's not the issue. Commented Apr 4, 2021 at 16:39
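
As context for the bit-7 convention the comments are debating, here is a minimal C sketch (not from the thread; the helper names are invented for illustration). It shows the two directions of the conversion: setting bit 7 to get the Apple keyboard/display form of a character, and clearing it to get the plain 7-bit ASCII that, per the comment above, BASIC-side routines such as FRMNUM expect:

    #include <stdio.h>

    /* Set bit 7: plain ASCII -> Apple high-bit form ('1' = $31 -> $B1). */
    unsigned char to_high_bit(unsigned char c)    { return (unsigned char)(c | 0x80); }

    /* Clear bit 7: Apple high-bit form -> plain 7-bit ASCII ($B1 -> $31). */
    unsigned char to_plain_ascii(unsigned char c) { return (unsigned char)(c & 0x7F); }

    int main(void)
    {
        unsigned char input[] = "123";            /* $31 $32 $33 in plain ASCII */
        for (int i = 0; input[i] != '\0'; i++) {
            unsigned char high = to_high_bit(input[i]);
            printf("'%c'  plain=$%02X  high-bit=$%02X  back=$%02X\n",
                   input[i], input[i], high, to_plain_ascii(high));
        }
        return 0;
    }

Running it prints, for example, `'1'  plain=$31  high-bit=$B1  back=$31`, matching the values quoted in the comments; the operation is a single OR/AND with $80 (or `ORA #$80` / `AND #$7F` on the 6502), so the conversion itself is cheap in either direction.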