7

As a teenager with a BBC Micro in the 1980s, I would often write small graphical programs, and - like many programmers of the day - I was very familiar with all the modes and the colours associated with them.

Comparing the BBC Micro's Mode 2 colour palette (16 colours: 8 'real' colours and 8 'flashing' colours) to other computers of the day, especially the C64 and the Amstrad CPC 664, I would wonder why the designers of the BBC chose flashing variants for 8 of the colours available in Mode 2, rather than 8 more 'original' colours. I would have loved to be able to use colours like brown, purple, shades of grey, and a 'flesh tone' (the latter name being a product of its time, when a caucasian skin colour was assumed).

Many years later, I decided that the colour palette selected must have been a limitation of the hardware rather than just an arbitrary choice.

My question is: How expensive would it have been to provide 8 more 'real' colours instead of the flashing colours? Or was there some other reason the palette didn't extend beyond the 8 'real' colours, e.g. compatibility with older BBC models?

10
  • 2
    Videotex was still a thing in those days, so (wild guesswork) maybe the Micro had potential application as a videotex terminal, which as far as I recall used flashing characters to annoy users. Commented Jan 15, 2024 at 1:41
  • 2
    One of the main reasons that caused old computers to use flashing (or underscore, for that matter) in favor of more colors was that flash - while typically occupying at least one bit in video memory - is purely internal to the video circuitry and doesn't need an outside pin. With the same bit in video memory, you could theoretically double the amount of colours, but to get it to the monitor, you needed an outside pin - which was expensive on custom chips, so many vendors introduced flash as a (rather useless) feature. Commented Jan 15, 2024 at 8:27
  • 1
    For the many users that didn’t have a colour monitor but instead a black-and-white ‘portable’ CRT television, multiple colours were somewhat less useful than having (one) flashing option. Commented Jan 15, 2024 at 18:03
  • 2
    To be pedantic: the 1981 BBC Micro predates the CPC by a full three years. Strict same-market contemporaries of the CPC would be things like the 256-colour Enterprise. Commented Jan 15, 2024 at 22:21
  • 1
    Flashing text may have been desirable in Mode 2, given that it had to be present in Mode 7 (which used the SAA5050 Teletext character generator rather than the 6845 CRT controller used in graphics modes). There weren't any "older" BBC home computers, but there were earlier Acorn computers - perhaps that's what you meant? Commented Jan 16, 2024 at 16:54

3 Answers

7

From the Wikipedia article:

Modes 0 to 6 could display colours from a logical palette of sixteen: the eight basic colours at the vertices of the RGB colour cube and eight flashing colours made by alternating the basic colour with its inverse.

"the vertices of the RGB colour cube" is a fancy way of saying:

  • Red - on or off (1 bit)
  • Green - on or off (1 bit)
  • Blue - on or off (1 bit)

Which gives you 8 colors:

  • Black (all off)
  • Red (red on)
  • Green (green on)
  • Blue (blue on)
  • Yellow (red and green on)
  • Magenta (red and blue on)
  • Cyan (green and blue on)
  • White (red, green and blue on)
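The enumeration above can be sketched in a few lines of Python. The bit weights (R=1, G=2, B=4) match the BBC Micro's physical colour numbering, where e.g. colour 3 = red + green = yellow:

```python
# Sketch: derive the 8 'vertex' colours of the RGB cube from one bit
# per channel. Bit weights R=1, G=2, B=4 give the BBC's physical
# colour numbers 0-7.
NAMES = {
    (0, 0, 0): "black",   (1, 0, 0): "red",
    (0, 1, 0): "green",   (1, 1, 0): "yellow",
    (0, 0, 1): "blue",    (1, 0, 1): "magenta",
    (0, 1, 1): "cyan",    (1, 1, 1): "white",
}

for n in range(8):
    r, g, b = n & 1, (n >> 1) & 1, (n >> 2) & 1
    print(n, NAMES[(r, g, b)])
```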

These colors were quite common in many computers of the time, and live on in HTML. Note also that Cyan/Magenta/Yellow is very common in color printers (inkjet and laser).

Since 4 bits is more logical in a binary computer than 3 bits, a lot of systems used 4 bits for color in some fashion. The BBC Micro used it for flashing, which is great for getting attention. Other systems used it in other ways. The IBM PC CGA adapter used the 4th bit as intensity, with one exception - instead of "dark yellow", the CGA monitor had some special circuitry to make a more useful brown color.
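The CGA intensity scheme (and its brown exception) can be sketched roughly as follows. The channel levels 0xAA/0x55 are the commonly cited approximations, and the "halve the green channel" rule is a simplification of what the monitor's brown circuit did:

```python
# Sketch: 4-bit IRGB colour number -> approximate (r, g, b) levels.
# Levels are the commonly quoted approximations, not exact voltages.
def irgb_to_rgb(n):
    i = (n >> 3) & 1                              # intensity bit
    bits = ((n >> 2) & 1, (n >> 1) & 1, n & 1)    # r, g, b bits
    rgb = [c * 0xAA + i * 0x55 for c in bits]
    if n == 6:      # dark yellow: monitor circuit reduces green -> brown
        rgb[1] //= 2
    return tuple(rgb)

print(irgb_to_rgb(6))    # brown rather than dark yellow
```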

There are two basic ways to get more colors: more bits per pixel, or a lookup table. The BBC Micro's graphics and text modes used 1, 2, or 4 bits per pixel depending on the mode, with a logical-to-physical palette mapping each stored value onto the available colors. More bits = more RAM, which cost directly in hardware and also limited the amount of memory available for everything else, so this was a common situation in 8-bit and even 16-bit (e.g., IBM PC with CGA) computers.
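The lookup-table idea can be sketched like this: in a 2-bit-per-pixel mode, video RAM stores only small indices, and a separate logical-to-physical table (reprogrammable on the BBC with VDU 19) decides which real colour each index displays. The pixel values here are made up for illustration; the default palette shown is, as I recall, the BBC's Mode 1 default:

```python
# Sketch: a 2-bit-per-pixel mode stores only indices 0-3; a small
# logical-to-physical table maps each index to a physical colour.
logical_palette = [0, 1, 3, 7]   # black, red, yellow, white (Mode 1 default)

pixels = [0, 2, 3, 1]            # hypothetical 2-bit values in video RAM
physical = [logical_palette[p] for p in pixels]
print(physical)
```

Reassigning an entry in `logical_palette` recolours every pixel using that index at once, which is what made palette-swap animation cheap.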

A more extensive lookup table - e.g., still just 3 bits per pixel but from an 8 bit or possibly larger selection of colors - was certainly possible, and some machines did that, such as the Atari 400/800. But 8 or 16 colors, with only some colors available depending on the mode, was extremely common - IBM PC CGA, Apple ][, Commodore 64, VIC 20, etc.

Sticking to 1 bit for each part of the video signal - Red, Green, Blue - keeps the circuitry extremely simple. Flashing by inverting the signal is also pretty simple, as is the IBM PC CGA (and others) intensity level. Anything beyond that gets relatively complex by the standards of the time.
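The "inverting the signal" trick is cheap because it is just one XOR per channel: each flashing colour alternates between a base colour and its bitwise complement. A sketch, using R=1, G=2, B=4 bit weights:

```python
# Sketch: a flashing colour alternates between a base 3-bit RGB value
# and its inverse (every channel bit flipped, i.e. XOR with 0b111).
def flash_phase(base, phase):
    """Return the 3-bit RGB value displayed during the given flash phase."""
    return base ^ 0b111 if phase else base

# e.g. flashing red alternates red (1) <-> cyan (6)
print(flash_phase(1, 0), flash_phase(1, 1))
```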

6
  • 1
    If you allow for one intensity level, four is not much more complicated - But it needs external pins on your circuitry to get it to the monitor, which is in fact a problem.... Flashing is purely internal to the video circuitry and doesn't need additional pins Commented Jan 15, 2024 at 8:30
  • 1
    For the record, colour flashing on the BBC Micro was controlled by a hardware register bit that the software modified as needed during its 10 ms regular timer interrupt to get the required flash intervals. Commented Jan 15, 2024 at 19:10
  • 4
    The "brown" circuit was a trait of the monitor, rather than the CGA, and wasn't present in all monitors. I myself preferred the "amber" color which was produced by monitors without the circuit than the "brown" color produced by those with it. Commented Jan 15, 2024 at 19:42
  • @supercat You're right. Faulty memory (mine, not the computer...) Commented Jan 15, 2024 at 19:53
  • For me, I liked having 8 spare colour slots. This allowed me to have 8 colours, plus 8 more I could use for palette-based animation via the VDU 19 call. It was an effect that Henley (of TYB fame) and I used to produce scrolling landscapes and Tron-like grids in the demos we wrote for the BBC. Commented Jul 14, 2024 at 13:19
0

Supplementary answer to others

The Atari 2600 video games console, released in 1977, offered at least 104 colours on PAL and 128 on NTSC (also Wikipedia). Its graphics system was very basic, and more challenging to program for.

Given that the 2600's 1977 release pre-dates the BBC Micro by a few years, I would say a larger palette ought to have been feasible for the BBC Micro.

Of course, today there's VideoNULA (and YouTube), which demonstrates the BBC Micro's versatility in accommodating such an extension.

1
  • Conversely the 2600 has only 8 colours in SECAM, its internal logic elsewhere being directly and exclusively in chrominance and luminance. There’s no consumer device able to run a monitor with that palette at that time. With its 80-column text modes, the BBC not only supports a monitor but encourages one. Commented Jul 15, 2024 at 20:52
0

I suspect it was a BBC requirement, but palette extenders existed from 1982, including the one the VideoNULA was inspired by. I would have thought turning the flashing bit into an average with grey would have been fairly easy, requiring minimal extra fast logic. The real question might be: why didn't they add extra colours for the Master and do the flashing in software?

1
  • I remember seeing those palette extenders in magazines, and lusting after them. Did any commercial software 'in the day' (e.g. games, art packages) use them, other than what might have been packaged with the palette extender hardware? Commented Jul 22, 2024 at 23:20
