Patrick Schlüter

TL/DR: computer speed didn't really matter, because even a slow computer was still orders of magnitude faster than no computer.


BASIC interpreters in 8-bit computers were slow. There are already enough answers explaining why; mainly, they traded performance for space, since the interpreter had to fit in less than 16 KiB. Most machines had BASIC either in ROM (Apple II, CBM, TRS-80, etc.) or loaded into RAM (Sharp MZ-80, CP/M machines, etc.), which constrained the size more than the performance.
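
Not a reconstruction of any particular ROM, but a minimal Python sketch of the kind of design those size limits forced: no line-number index and no cached parse results, so every GOTO walks the program from the top and every expression string is re-parsed on each pass. All names here (`find_line`, `run`, the toy listing) are invented for illustration.

```python
# Toy BASIC-style interpreter loop illustrating the space-for-speed trade-off:
# keeping an index of line numbers or pre-parsed constants would cost RAM,
# so lookups and parsing are redone over and over instead.

program = [
    (10, ("LET", "I", "0")),
    (20, ("LET", "I", "I+1")),
    (30, ("IF", "I<1000", 20)),   # IF I<1000 THEN GOTO 20
    (40, ("END",)),
]

def find_line(target):
    # Linear scan from the top, as many small interpreters did:
    # no index table, so every GOTO costs O(program length).
    for idx, (line_no, _) in enumerate(program):
        if line_no == target:
            return idx
    raise RuntimeError(f"Undefined line {target}")

def run():
    variables = {}
    pc = 0
    while pc < len(program):
        _, stmt = program[pc]
        op = stmt[0]
        if op == "LET":
            _, name, expr = stmt
            # The expression string is re-parsed and re-evaluated on every
            # pass, much as 8-bit BASICs re-read ASCII constants each time
            # through a loop.
            variables[name] = eval(expr, {}, variables)
            pc += 1
        elif op == "IF":
            _, cond, target = stmt
            pc = find_line(target) if eval(cond, {}, variables) else pc + 1
        elif op == "END":
            break
    return variables

print(run())   # {'I': 1000}, after roughly a thousand scans of the line table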

Performance was not that important once you realize what these computers were replacing: scientific and programmable calculators and slide rules.

While games were the motor of progress in the 80s, the pioneers of home computing were engineers who needed computations for their work and school teachers doing calculations. These were the same people who also used programmable and scientific calculators, mainly from TI and HP, or even slide rules (my late brother started his career as an engineer in 1981 with a slide rule and a non-programmable scientific calculator). Magazines at the time would present computers more as an extension of programmable calculators than as a category of their own. It was only later, when people realized that computers could also function as programmable game consoles, that the speed of BASIC became inadequate; for calculation purposes it was plenty for the requirements of the time.

So even a slow interpreter was leagues ahead of what could be processed "by hand" at that time.

It was only a bit later that competition and the benchmark mentality pushed (dubious) optimizations for the sake of performance. IMHO binary floats were one of those dubious optimizations (I grew up on TI-99/4A BASIC and Sharp pocket computer BASIC, both using decimal floats, and was shocked at how bad Applesoft floats were: fast, but barely usable for my needs at school).
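
A small Python sketch of the difference being alluded to, using Python's `decimal` module only as a stand-in for decimal floating point (it does not emulate the TI or Sharp formats, and the values are illustrative): decimal fractions common in schoolwork round exactly in a decimal representation but not in a binary one.

```python
from decimal import Decimal

# Summing 0.1 ten times: the classic case where binary floats drift while
# decimal arithmetic gives the answer a calculator (or pencil and paper) gives.
bin_sum = sum(0.1 for _ in range(10))              # binary (IEEE 754) floats
dec_sum = sum(Decimal("0.1") for _ in range(10))   # decimal arithmetic

print(bin_sum)         # 0.9999999999999999  -> visible rounding error
print(dec_sum)         # 1.0                 -> exact
print(bin_sum == 1.0)  # False
print(dec_sum == 1)    # True
```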
