Color Generation in IBM CGA, EGA and VGA

(September 9, 2018)

It started quite innocently with a few Twitter threads about retro DOS gaming. The question was why no games (or any other software, for that matter) designed for EGA graphics cards made use of the additional colors EGA could offer. It’s widely known that EGA cards had a reprogrammable palette and could show any 16 out of 64 available colors on screen; but still, all software of the time just used the 16 default colors that were already possible with CGA graphics. Some explanations for this phenomenon were discussed, but the discussion mostly centered on what was possible, not why. Falling prey to nerd sniping, I dug deeper and deeper into the topic, including writing test programs in BASIC and Pascal, and now I finally understand everything about color generation in CGA, EGA and VGA cards.

CGA

The original Color Graphics Adapter (CGA) card from 1981 has two video outputs: composite analog video and digital RGBI. I’m going to ignore the composite output in this article; to see what’s up with that (and how it can be misused to great effect), look at this great article. I will focus on RGBI here, which is output from the card via a DE-9 connector carrying six digital (TTL) signals: Horizontal and vertical synchronization, Red, Green, Blue and Intensity. The remaining three pins on the connector are unused or wired to ground.

The Red, Green and Blue signals, when high, make the monitor show RGB primary colors with two thirds of its full brightness. The fourth signal, Intensity, adds the remaining third of the monitor’s brightness by adding a pedestal to all channels. Black with Intensity thus doesn’t stay black, but becomes dark gray, and the colors are rendered with slightly reduced saturation. (For comparison, another contemporary computer system with an RGBI-style palette, the Sinclair ZX Spectrum, used »multiplicative« intensity, i.e. the I bit only modified the level of the enabled RGB bits, keeping black black.)

The »naïve« 16-color RGBI palette.

This is however not what a true RGBI monitor displays. For reasons unknown, a proper monitor modifies the »dark yellow« color into brown by halving its green level, as if the G bit were forced off and I were enabled for the green channel only (which isn’t normally possible with RGBI).

Actual 16-color RGBI palette with modified brown.

From a software point of view, the RGBI colors are mapped into 4-bit nibbles in the order IRGB, i.e. I is bit 3, R is bit 2, G is bit 1, and B is bit 0. For example, light gray is color index 7, bright red is 12, and the peculiar brown is index 6. (The two figures above already use this order.)
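
To make the mapping concrete, here is a minimal C sketch of how an RGBI monitor turns a 4-bit color index into RGB levels, including the brown modification. The channel levels (2/3 per color bit, 1/3 for the Intensity pedestal) are the usual approximation, not measured values:

```c
#include <stdint.h>

/* Decode a 4-bit IRGB index (I = bit 3, R = 2, G = 1, B = 0) the way
 * an RGBI monitor would: 2/3 brightness (0xAA) per enabled color bit,
 * plus a 1/3 pedestal (0x55) on all channels when Intensity is set. */
void irgb_to_rgb(uint8_t index, uint8_t rgb[3])
{
    uint8_t i = (index >> 3) & 1;
    rgb[0] = 0xAA * ((index >> 2) & 1) + 0x55 * i;  /* red   */
    rgb[1] = 0xAA * ((index >> 1) & 1) + 0x55 * i;  /* green */
    rgb[2] = 0xAA * ( index       & 1) + 0x55 * i;  /* blue  */

    if (index == 6)  /* dark yellow: halve the green level -> brown */
        rgb[1] = 0x55;
}
```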

The CGA card is built from a Motorola MC6845 CRTC display controller, 16 KiB of DRAM, a character set ROM chip and dozens of 74-series TTL chips. In addition to the twelve registers of the CRTC and some lightpen stuff we’re not going into detail about here, it has two 6-bit registers that control the display mode and color interpretation. Four bits of one of these registers form an RGBI color index; I’m going to call this the »color register« from now on. The way colors are generated and the color register is interpreted varies greatly between the three basic display modes of the CGA:

  • Text Mode: 40×25 or 80×25 cells of 8×8 pixels each. The video RAM holds two bytes per cell: An index into character ROM, describing the character to be shown, and an attribute byte, which contains a nibble each for the RGBI foreground and background color of the cell. With a configuration bit, the interpretation of the background color’s I bit can be changed to mean blinking instead. In that mode, background Intensity is always forced low, and half of the time (toggling every few frames), the foreground color is ignored in cells with an enabled background I bit (see the sketch after this list). The color register is used for the overscan (»border«) area, a small strip of a few pixels outside the actual display area on each edge.
  • High-Resolution Graphics: 640×200 pixels of monochrome bitmap graphics. The video RAM contains one bit for each display pixel. Background and overscan are always black; the foreground color can be selected with the color register, but only globally for the whole screen.
  • Multi-Color Graphics: 320×200 pixels of 2-bpp (4-color) bitmap graphics. Each pair of bits in the video RAM selects one of four colors. The colors are semi-fixed: For index 1, 2 and 3, there’s a choice from six built-in, hardwired palettes. The color register selects the color for index 0 and the overscan area.
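
As referenced in the text mode item above, the attribute byte layout can be summarized in a small sketch; it assumes the BIOS-default configuration in which the background I bit means blinking:

```c
#include <stdint.h>

/* Decode a CGA text-mode attribute byte, assuming the configuration
 * bit that turns the background I bit into a blink flag is set (the
 * BIOS default). */
typedef struct { uint8_t fg, bg, blink; } TextAttr;

TextAttr decode_attribute(uint8_t attr)
{
    TextAttr a;
    a.fg    = attr & 0x0F;          /* IRGB foreground color    */
    a.bg    = (attr >> 4) & 0x07;   /* background, I forced low */
    a.blink = (attr >> 7) & 1;      /* former background I bit  */
    return a;
}
```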

The palettes for multi-color mode come in three pairs of two related palettes which only differ in whether the intensity bit is set.

CGA palette 0 (BIOS mode 4 palette 0)

Note how black doesn’t change into dark gray for the high-intensity version? That’s not a mistake; remember that index 0 can be chosen independently from all 16 RGBI colors, regardless of whether the high-intensity palette is selected for the remaining three colors. In fact, you can consider black in this (and the two following) figures as merely a placeholder for any color you want; black is just the default.

CGA palette 1 (BIOS mode 4 palette 1)

The high-intensity variant of palette 1 is the most common one, because it’s the standard palette that’s activated by the BIOS when entering CGA multi-color graphics mode (mode 4). BIOS calls exist to switch between palettes 0 and 1 and select the background color, but Intensity can only be turned off using direct register access.
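
In code, this could look as follows – a sketch in Borland-style C for DOS, using the real-mode BIOS interface (INT 10h, AH=0Bh) and, for the intensity bit, a direct write to the CGA’s color select register at port 3D9h:

```c
#include <dos.h>

/* Select CGA palette 0 or 1 via the BIOS (INT 10h, AH=0Bh, BH=1). */
void set_cga_palette(int pal)
{
    union REGS r;
    r.h.ah = 0x0B; r.h.bh = 1; r.h.bl = (unsigned char)pal;
    int86(0x10, &r, &r);
}

/* Set the background/overscan color via the BIOS (BH=0). */
void set_cga_background(int color)
{
    union REGS r;
    r.h.ah = 0x0B; r.h.bh = 0; r.h.bl = (unsigned char)color;
    int86(0x10, &r, &r);
}

/* The color select register (port 3D9h) is write-only, so direct
 * access composes the whole value: background color in bits 0-3,
 * intensity in bit 4, palette select in bit 5. */
void write_color_select(int background, int intensity, int pal)
{
    outportb(0x3D9, (unsigned char)((background & 0x0F)
             | (intensity ? 0x10 : 0) | (pal ? 0x20 : 0)));
}
```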

The only technical difference between palettes 0 and 1 is that the blue bit is set for all colors in palette 1.

CGA palette 2 (BIOS mode 5)

The third palette is a slightly unofficial one. It is activated in BIOS mode 5, which is officially »grayscale graphics«. There is no proper grayscale mode in CGA though; the term only applies to the composite video output, where the color difference signal can be turned off. On the RGBI interface, however, the CGA still happily outputs color; in fact, mode 4 palette 1 and mode 5 behave identically, except that magenta is replaced by red. The register bit and BIOS call that change palettes in mode 4 have no effect in mode 5 though.
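
Putting the three palettes together, the hardwired color mapping can be expressed quite compactly. Note that this is derived from the palette tables above, not from the CGA schematic, so treat it as a behavioral model rather than the actual gate-level logic:

```c
#include <stdint.h>

/* Behavioral model of CGA multi-color palette selection for pixel
 * values 1..3 (value 0 always comes from the color register).
 * pal selects between mode 4 palettes 0 and 1, mode5 selects the
 * cyan/red/white palette, intensity sets the I bit. */
uint8_t cga_palette_entry(uint8_t pixel, int pal, int mode5, int intensity)
{
    uint8_t r = (pixel >> 1) & 1;      /* red: set for pixels 2 and 3 */
    uint8_t g, b;
    if (mode5) {                       /* cyan, red, white            */
        g = b = pixel & 1;
    } else {                           /* green/red/brown (pal 0) or  */
        g = pixel & 1;                 /* cyan/magenta/white (pal 1)  */
        b = (uint8_t)pal;              /* palette 1 sets blue for all */
    }
    return (uint8_t)((intensity << 3) | (r << 2) | (g << 1) | b);
}
```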

EGA

The Enhanced Graphics Adapter (EGA), introduced in 1984, adds four major features over CGA: it increases the amount of video RAM to 64 or 128 KiB, adds support for planar 4-bit (16-color) bitmap graphics modes, makes the palette fully programmable with any 16 out of 64 colors, and introduces 350-line display modes for much crisper text and graphics output.

The new 16-color graphics modes use a rather complex video memory addressing scheme. Instead of the simple 4-pixels-in-a-byte packing of CGA’s 4-color graphics mode, the data is split into so-called bitplanes, each of which stores one of the four bits of the color value, for eight adjacent pixels per byte. All of these bitplanes map to the same memory address though; the CPU has to select which bitplanes to read or write by setting registers on the card first. This is a bit cumbersome to say the least, but it has its merits too: The EGA card’s memory access circuitry (the »Graphics Controller«) can be configured to do things like write to multiple bitplanes at once, copy data in video memory at four bytes per cycle, manipulate individual pixels across bitplanes etc. Describing all this would take too much space here and isn’t the point of this article anyway; if you want to learn more about the EGA’s datapath, I recommend the relevant chapters of Michael Abrash’s good old Graphics Programming Black Book (don’t be surprised that it’s saying VGA instead of EGA all the time – the mechanism with the four latches and ALUs is the same in EGA).
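
Just to give a taste of the register interface, here is a minimal Borland-style C sketch of plain plane selection, using the Sequencer’s Map Mask register (index 2 at port 3C4h) for writes and the Graphics Controller’s Read Map Select register (index 4 at port 3CEh) for reads:

```c
#include <dos.h>

/* Steer CPU writes: bit n of the mask enables bitplane n. */
void select_write_planes(unsigned char mask)
{
    outportb(0x3C4, 2);     /* Sequencer index: Map Mask       */
    outportb(0x3C5, mask);
}

/* Steer CPU reads: only one plane (0..3) can be read at a time. */
void select_read_plane(unsigned char plane)
{
    outportb(0x3CE, 4);     /* Graphics Ctrl: Read Map Select  */
    outportb(0x3CF, plane);
}
```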

EGA is not register compatible with CGA, even the CRTC is a different one. The monitor connector, however, remains the same DE-9 plug, except that EGA assigns two of the hitherto unused pins to additional color signals. Specifically, the single intensity signal of CGA is split into three distinct intensity signals in EGA. As a result, each RGB color channel can have one of four intensities. In total, that’s 64 colors, or 6-bit RGB (2 bits per channel).

EGA 6-bit, 64-color palette (reordered to look nicer)

All of EGA’s 1/2/4-bit video data goes through a chip called the Attribute Controller, which contains a small 16×6-bit RAM that translates the 4-bit palette index into the 6-bit output color. (The order of the bits in the palette registers is a bit peculiar; the figures I’m showing here don’t follow the actual order of the values in the palette.) When the BIOS sets up a video mode, it programs the palette with values that are equivalent to what the CGA would generate. In 4-color modes, switching between high-intensity palettes 0 and 1 and selecting the background color is also possible using the BIOS, but anything beyond that requires direct palette register manipulation. Furthermore, BIOS modes 4 and 5 are exactly identical on EGA, so mode 5 does not select CGA palette 2 (at least not on the original IBM EGA BIOS; clones may or may not implement this, I don’t know).
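
For reference, the peculiar bit layout of a 6-bit EGA palette register value: bits 0–2 carry the primary (2/3-level) B, G and R signals, bits 3–5 the secondary (1/3-level) b, g and r signals. A small sketch, using the same approximate channel levels as the CGA example earlier:

```c
#include <stdint.h>

/* Decode a 6-bit EGA palette register value (bits: r g b R G B) into
 * 8-bit RGB: a primary bit contributes 2/3 (0xAA), a secondary bit
 * 1/3 (0x55) of full brightness per channel. */
void ega_to_rgb(uint8_t v, uint8_t rgb[3])
{
    rgb[0] = 0xAA * ((v >> 2) & 1) + 0x55 * ((v >> 5) & 1);  /* red   */
    rgb[1] = 0xAA * ((v >> 1) & 1) + 0x55 * ((v >> 4) & 1);  /* green */
    rgb[2] = 0xAA * ( v       & 1) + 0x55 * ((v >> 3) & 1);  /* blue  */
}
```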

EGA is meant to offer a smooth upgrade path for owners of CGA hardware. As such, an EGA card can be used on a CGA monitor and an EGA monitor can be used on a CGA card. This has some serious ramifications.

First, when a CGA monitor is attached to an EGA card, it will obviously use only one intensity signal instead of three. Due to the way the connector is wired, that intensity signal is the least significant bit of the green channel. Naturally, this reduces the choice of colors to the 16 RGBI ones, and is also subject to the dark-yellow-to-brown modification. Reprogramming the EGA palette thus won’t give any new colors compared to CGA, only the order of the RGBI colors may be changed.

Second, when an EGA monitor is connected to a CGA card, it will only receive one intensity signal instead of three. The other two pins may be wired to ground, or floating, or whatever – the monitor can’t rely on anything about them. All it can do is emulate a CGA monitor: ignore the extra intensity signals altogether, replicate the green LSB into the red and blue channels, and finally apply the dark-yellow-to-brown modification. Easy.
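
Expressed in code, the monitor’s emulation logic might boil down to something like this; it reuses irgb_to_rgb() from the CGA section above, including its brown modification:

```c
#include <stdint.h>

/* Derive a CGA-style IRGB index from the six EGA color lines in
 * 200-line mode: keep the primary R/G/B bits, treat the secondary
 * green line (the pin a CGA card drives with Intensity) as the I bit,
 * and ignore the other two secondary lines. */
uint8_t ega_lines_to_irgb(uint8_t v)
{
    uint8_t i = (v >> 4) & 1;                 /* secondary green = I */
    return (uint8_t)((i << 3) | (v & 0x07));  /* primary R, G, B     */
}
```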

But wait – how does the EGA monitor know that it needs to emulate a CGA monitor in the first place? There’s no switch or spare pin for that; it has to guess from the signal. This is possible because of differences in video timing: All CGA (and CGA-compatible) modes use 200 active display lines and a 15.7 kHz horizontal sync frequency, while the new EGA modes with 350 lines use 21.85 kHz instead. What the EGA monitor thus does is use the full 6 bits of RGB information only in 350-line modes (as these can only be generated by a real EGA card), and enable CGA RGBI emulation in 200-line modes – after all, chances are that it might be connected to an actual CGA card that expects this behavior; it simply can’t tell.

The collateral damage from this compatibility decision is that in all 200-line modes, including the EGA’s new 16-color 320×200 and 640×200 modes, the palette is limited to the 16 RGBI colors known from CGA, even with a proper EGA monitor attached. The order of the colors in the palette may still be modified, but the 48 colors not in the CGA’s RGBI palette are unattainable, as two out of six color bits are simply ignored by the monitor.

What remains of the EGA palette when interpreted as RGBI

Another interesting side effect of this CGA compatibility malarkey is that even the palette that’s set up by the BIOS differs a bit between 200-line and 350-line modes – specifically, the value for brown (index 6) is different. In 350-line modes, the monitor receives raw 6-bit RGB data that’s not interpreted by the monitor in any special way, so the EGA card is responsible for emulating the dark-yellow-to-brown modification that a CGA monitor would perform. Thus, it sets palette index 6 to 2/3 red, 1/3 green, 0/3 blue. In 200-line modes, however, the monitor would interpret this as the RGBI color »high-intensity red«, because the green channel has its main intensity bit unset but its least-significant bit set, and thus the green LSB is interpreted as the global intensity bit for all channels! So, the card needs to output actual dark yellow and have it turned into brown by the monitor, as is the case with a pure CGA system.
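
In terms of the rgbRGB bit layout from the sketch above, the two BIOS defaults for palette index 6 would be as follows (treat the exact constants as an assumption based on common EGA BIOS documentation):

```c
/* BIOS default palette values for index 6 (brown), rgbRGB layout: */
#define BROWN_350  0x14  /* g|R: 2/3 red + 1/3 green; sent to the
                            monitor as-is, already brown            */
#define BROWN_200  0x06  /* G|R: dark yellow; the monitor's RGBI
                            logic (green LSB = global I) makes it
                            brown                                   */
```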

Despite all the technical restrictions that result from CGA/EGA cross-compatibility, it’s not a true »plug and play« system: The EGA card has a hardware switch (DIP, jumper, or similar) to tell it whether the attached monitor is EGA-capable or not. The main function of this switch is to configure whether the text modes shall use 350 lines with 8×14-pixel character cells or classic CGA 200-line mode with an 8×8-pixel font.

VGA

In 1987, the Video Graphics Array (VGA) was introduced. (According to Wikipedia, the term »Array« was used instead of »Adapter« because VGA was always a single-chip design.) It comes with at least 256 KiB of video memory, adds a new resolution (640×480) and changes the timing of the CGA/EGA-compatible 200-line 60 Hz modes to 400 lines at 70 Hz with line doubling. More importantly though, it is able to show up to 256 on-screen colors, freely selectable from a palette of 262,144 colors, i.e. 6 bits per RGB channel. Transporting this over 18 digital signal lines becomes unwieldy, so VGA uses analog transmission for the color signals instead. In a certain way, a VGA monitor is thus a significantly »dumber« device than an EGA monitor, because it has no business in color interpretation at all: it simply amplifies the color signals produced by the VGA card’s digital-to-analog converter (DAC) and sends them to the picture tube. Specifically, it can’t do things like RGBI interpretation and yellow-to-brown modification, because that’s only possible with a digital signal.

VGA 18-bit, 262,144-color palette

VGA is register compatible with EGA; in particular, all the fancy bitplane stuff is still in place, and so is the color generation logic for 16-color modes. As a result, the VGA performs two palette lookups in 4- and 16-color modes: First, the 4 or 16 color indices are translated to 6-bit RGB using the EGA-compatible palette registers. The resulting 64 colors are then translated again into the full 18-bit RGB data for the VGA’s DAC. When switching into a 4- or 16-color mode, the BIOS configures the EGA palette exactly as it did on an actual EGA card, and additionally sets the first 64 entries of the VGA’s 256-color palette to match the EGA’s output – or, to be more precise, the EGA monitor’s output. After all, the effects of the EGA monitor’s CGA RGBI emulation mode need to be emulated as well, and in a VGA system, this is the card’s duty, not the monitor’s. Thus, there are two different 64-color palettes used by the BIOS: The »normal« one with 64 distinct 6-bit RGB colors, which is used in 350-line, 480-line and text modes, and the RGBI one with only 16 colors (each repeated four times in a special pattern), including the yellow-to-brown modification, for the 400-line modes that replace and emulate the EGA’s 200-line graphics modes. This way, applications from the CGA and EGA era look and behave exactly as they used to, even if they modify the EGA palette.
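
Programming the second stage, the DAC palette, is straightforward; a Borland-style C sketch:

```c
#include <dos.h>

/* Set one 18-bit DAC entry: write the index to port 3C8h, then the
 * red, green and blue components (6 bits each, 0..63) to port 3C9h. */
void set_dac_entry(unsigned char index, unsigned char r,
                   unsigned char g, unsigned char b)
{
    outportb(0x3C8, index);
    outportb(0x3C9, r);
    outportb(0x3C9, g);
    outportb(0x3C9, b);
}
```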

The »new« modes in VGA work more or less as extensions to the EGA ones. As far as color generation is concerned, 640×480 graphics mode works just like EGA’s 640×350 graphics mode. The VGA text modes act like EGA’s 350-line text mode, even though they use 400 lines and 9×16-pixel cells – in that way, they are an exception from the »400-line mode means CGA color emulation« rule, because the full EGA color set is available there. The 256-color mode is also rather straightforward: the EGA palette translation step is bypassed in this mode and everything goes directly through the VGA palette registers. The EGA’s bitplane logic, though, is still mostly functional in 256-color mode, except that it’s no longer about four bitplanes, but rather four-way pixel interleaving: The first pixel of the screen goes into plane 0, the next one goes into 1, then 2, then 3, then 0 again, and so on. To make this a little easier to use, VGA introduces so-called »chain 4« mode, in which the least significant two bits of the address are decoded as the plane index. This effectively reverses the effect of plane separation, and makes working with 320×200×256 mode (»mode 13h«, named for its hexadecimal BIOS mode number) very easy: Every pixel is a directly-addressable byte in the A0000h memory segment. The downside of this is that only 64 KiB of the VGA’s 256 KiB video memory are accessible this way, but this can be worked around by disabling chain-4 mode again and just putting up with the separate planes. This is a widely-used technique commonly known as »Mode X«.
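
The difference between chained and unchained addressing is easiest to see in a pair of putpixel routines (again Borland-style C; MK_FP builds a far pointer to the video segment at A000h):

```c
#include <dos.h>

/* Plain mode 13h with chain-4: one byte per pixel, linear addressing. */
void putpixel_chained(int x, int y, unsigned char color)
{
    unsigned char far *vram = (unsigned char far *)MK_FP(0xA000, 0);
    vram[(unsigned)y * 320u + (unsigned)x] = color;
}

/* Unchained ("Mode X"-style) addressing: select the plane via the
 * Sequencer's Map Mask register, then address within the plane, which
 * holds every fourth pixel (80 bytes per scanline). */
void putpixel_unchained(int x, int y, unsigned char color)
{
    unsigned char far *vram = (unsigned char far *)MK_FP(0xA000, 0);
    outportb(0x3C4, 2);                 /* Sequencer index: Map Mask */
    outportb(0x3C5, (unsigned char)(1 << (x & 3)));
    vram[(unsigned)y * 80u + ((unsigned)x >> 2)] = color;
}
```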

Conclusion

This wraps up our trip through the intricacies of color generation on the first three generations of IBM PC color graphics standards. Now we can finally answer the question that led me to this:

Q: Why did no software reprogram the EGA palette to get more varied colors in 320×200 graphics mode?

A: Because due to complicated reasons that have to do with CGA/EGA cross-compatibility, 48 out of the EGA’s 64 possible colors are simply unobtainable in 200-line graphics modes, and are displayed as aliases of the 16 base colors. Reprogramming the palette can only ever rearrange the 16 default colors. A freely programmable palette in 16-color modes is only possible with VGA.

Phew.

Appendix

At the beginning of the article, I mentioned that I now finally understand everything about color generation on CGA/EGA/VGA. Well, that was a lie. There’s one piece of the puzzle that I do not understand and that can only be tested by someone who has a real EGA monitor (which I don’t have). It’s about how an EGA monitor detects whether it has to enable CGA RGBI emulation or not. I wrote that it does so by checking the timing of the incoming signal, but that’s not necessarily true. There’s one more thing the EGA card does when switching between 200- and 350-line modes: it reverses the polarity of the vertical sync signal. In 200-line modes, the signal is usually low and only becomes high for a few lines once per frame; in 350-line modes, it’s inverted (mostly high, low once per frame). If I had to make a guess, I’d say it’s far easier for the monitor’s electronics to rely on this to detect which mode to use, instead of counting sync pulses: If the monitor detects positive sync polarity, it enables CGA RGBI emulation; if it detects negative sync polarity, it disables it. This actually works fine in an emulator (PCem at least): By turning on negative sync polarity, with all other video timing parameters still set to 200-line mode, we can get the full 64 EGA colors! The question is whether this works on an actual monitor, because it may very well use the detected mode information for display timing too. Confusing the monitor with the sync polarity trick may thus result in an unstable or otherwise distorted image – or it may work just fine. I don’t know.

Long story short, if you happen to have a retro PC setup with a real EGA card and a real EGA monitor, it would be awesome if you could run a little test BASIC program on it and tell me what happens!

Update (2021-01-18): Somebody tried the test program, and it didn’t work out as I had hoped: The monitor indeed uses the sync polarity to switch between timings, so the display becomes completely unusable when fudging around with that. So it still holds true after 35 years: There’s no way to use more than the default 16 colors on EGA in 200-line modes.
