« on: October 02, 2018, 11:11:55 PM »
I'm trying to finish up the help file and iron out the last few bugs (that I know of; I'm sure there are plenty that others will find) in WinAGI v1.2.3. In the meantime, I saw a few posts talking about fonts, so I thought I would share some of what I've learned about how AGI handles fonts. These notes are for MSDOS versions; I have not spent any time studying other platforms.
Where do font glyphs come from?
The 'glyphs' are the actual bitmap representations of each character as it is displayed on the screen. On MSDOS systems with EGA monitors, AGI uses the MSDOS built-in fonts, which are accessible through standard MSDOS interrupts.
The original character set used by MSDOS was the 7-bit ASCII standard. Since characters were stored as eight-bit numbers, it didn't take long for people to develop an extra set of 128 characters (known as 'extended characters').
The full set of characters (including extended characters) included in MSDOS was known as IBM Code Page 437. It included letters with diacritics so other languages such as Spanish and French could be represented. There was also a set of characters added that could be used to draw simple boxes and shapes. (Code Page 437 is no longer the standard for extended characters; Unicode values for 80h-FFh are not the same. This is why most text editors will show nonsensical characters if you try to display native 8-bit MSDOS text.)
MSDOS stored the character glyph data in two places: glyph data for the original 7-bit ASCII characters are stored at location F000h:FA6Eh (which is where the INT 43h vector points), and the extended character glyph data are found through the INT 1Fh vector.
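If you want to poke at these tables yourself, here's a small real-mode C sketch (Turbo/Borland style). The variable names and the 8-bytes-per-glyph assumption are mine, not anything from Sierra's code:

#include <dos.h>
#include <stdio.h>

int main(void)
{
    /* ROM glyphs for characters 00h-7Fh live at F000h:FA6Eh. */
    unsigned char far *ascii_font = (unsigned char far *)MK_FP(0xF000, 0xFA6E);

    /* The INT 1Fh vector points at the glyphs for 80h-FFh
       (if a program or driver has installed them). */
    unsigned char far *ext_font = (unsigned char far *)getvect(0x1F);

    /* Assuming an 8x8 font (8 bytes per glyph), dump the
       bitmap for 'A' (41h) as a crude ASCII-art grid. */
    int row, col;
    for (row = 0; row < 8; row++) {
        unsigned char bits = ascii_font['A' * 8 + row];
        for (col = 7; col >= 0; col--)
            putchar(((bits >> col) & 1) ? '#' : '.');
        putchar('\n');
    }
    return 0;
}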
On the Hercules Graphics Card (HGC), things were significantly different. The HGC only had one text mode (720 x 350 pixels) and one graphics mode (720 x 348 pixels). For technical reasons, in graphics mode the number of pixel rows had to be a multiple of 4. AGI did not use the text mode, because that mode used an 80 x 25 character format to be compatible with the Monochrome Display Adapter, and AGI needed a 40 x 25 text display. So Sierra used the HGC's graphics mode and wrote custom code to handle the display of text.
Since they were not using the built-in glyphs, Sierra created a set of custom font glyphs in a file called HGC_FONT, but it only included glyphs for the 7-bit ASCII characters; no extended characters. The format of the HGC_FONT file matches the native format the HGC uses to draw pixels on screen, an interleaved format where rows of pixels are not stored sequentially. Each glyph uses 24 bytes (twelve rows of two bytes, for a bitmap 16 pixels wide by 12 high; an interesting choice, since each line of text on the HGC display is 14 pixels tall). To extract the glyph bitmaps, arrange each character's 24 bytes in this order (a short code sketch of the de-interleaving follows the list):
byte02:byte03
byte00:byte01
byte06:byte07
byte04:byte05
byte10:byte11
byte08:byte09
byte14:byte15
byte12:byte13
byte18:byte19
byte16:byte17
byte22:byte23
byte20:byte21
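In code form, the de-interleaving is straightforward. Here's a hedged C sketch (the function and buffer names are mine):

/* De-interleave one HGC_FONT glyph: 24 bytes in, 12 two-byte
   rows out, top to bottom.  Rows are stored in swapped pairs,
   so output row i comes from source offset 4*(i/2) + 2*(1 - i%2). */
void deinterleave_glyph(const unsigned char *src, unsigned char *rows)
{
    int i;
    for (i = 0; i < 12; i++) {
        int ofs = 4 * (i / 2) + 2 * (1 - (i % 2));
        rows[i * 2]     = src[ofs];      /* left 8 pixels  */
        rows[i * 2 + 1] = src[ofs + 1];  /* right 8 pixels */
    }
}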
If you do the math, you will see that at 14 pixels per line, 25 rows of text result in 350 total pixels of height. But as mentioned above, in graphics mode the HGC only displays 348 rows; Sierra addressed this by clipping the top and bottom rows of text by one pixel each. It is almost impossible to detect by visually inspecting the screen output.
How does the character get drawn to the screen?
On HGC screens, AGI included custom-built routines in the HGC_GRAF.OVL file that extract the correct glyph from the HGC_FONT file and draw it directly on the display. If an extended character is encountered, it is simply ignored. This means that on an HGC system, it is impossible to display any extended characters.
On EGA screens, Sierra took advantage of the MSDOS built-in glyphs. But they didn't keep it simple.
On the text screen (using the text.screen() command), Sierra stuck to using MSDOS interrupts to handle cursor positioning and glyph drawing; this means that on the text screen all characters, including the extended characters, will draw correctly, regardless of the colors chosen for foreground and background. As an example, in Goldrush, extended characters are used to simulate the pages of a book when the bible verses are displayed.
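For reference, drawing on the text screen through the BIOS looks roughly like this in C (a sketch using Turbo C's int86; the attribute value is just an example):

#include <dos.h>

/* Position the cursor and draw one character with a full
   foreground/background attribute via the BIOS. */
void bios_put_char(int row, int col, unsigned char ch, unsigned char attr)
{
    union REGS r;

    r.h.ah = 0x02;          /* INT 10h AH=02h: set cursor position */
    r.h.bh = 0;             /* display page 0 */
    r.h.dh = (unsigned char)row;
    r.h.dl = (unsigned char)col;
    int86(0x10, &r, &r);

    r.h.ah = 0x09;          /* INT 10h AH=09h: write char + attribute */
    r.h.al = ch;            /* works for extended characters too */
    r.h.bh = 0;             /* display page 0 */
    r.h.bl = attr;          /* e.g. 1Eh = yellow on blue */
    r.x.cx = 1;             /* draw one copy */
    int86(0x10, &r, &r);
}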
On the graphics screen, Sierra decided to complicate things. For one thing, regardless of the color values passed with the set.text.attribute(FG, BG) command, the background would only be black (if BG == 0) or white (if BG != 0). When the background was white, the foreground color would always be displayed as black, regardless of the value of FG. If the background was black, the displayed foreground color would match the FG value.
In the code that handles writing characters to the screen, if the background color is black, AGI uses the standard interrupts to display the character, so all characters, including extended characters, display correctly. But if the background is not black, AGI does something weird - it copies the character glyph from the table used by the MSDOS interrupt for 7-bit ASCII characters (INT 43h), inverts it (creating a black-on-white character instead of white-on-black), and stores it in a separate location. AGI then reassigns INT 43h so that the character drawing interrupt (INT 10h AH=09h) will find this new location. If an extended character is encountered, AGI still copies glyph data from the place where only the 7-bit characters are stored, reading data that belongs to other MSDOS functions. This explains why glyphs for extended characters appear garbled when displayed on a non-black background on the graphics screen.
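Here's a rough model of that trick in real-mode C. To be clear, this is my reconstruction of the behavior described above, not Sierra's actual code, and it assumes 8 bytes per glyph:

#include <dos.h>

#define GLYPH_H 8                     /* assumed bytes per glyph */

static unsigned char inverted[GLYPH_H];

void draw_black_on_white(unsigned char ch)
{
    void interrupt (*old43)() = getvect(0x43);
    unsigned char far *font = (unsigned char far *)old43;
    union REGS r;
    int i;

    /* Copy the glyph and flip every bit.  For ch >= 80h this reads
       past the end of the 7-bit table -- the garbling described above. */
    for (i = 0; i < GLYPH_H; i++)
        inverted[i] = (unsigned char)~font[ch * GLYPH_H + i];

    /* Repoint INT 43h at the inverted glyph, then draw it as
       "character 0" so the BIOS indexes straight into our buffer. */
    setvect(0x43, (void interrupt (*)())(unsigned char far *)inverted);
    r.h.ah = 0x09;          /* INT 10h AH=09h: write char + attribute */
    r.h.al = 0;             /* index 0 of our one-glyph "font" */
    r.h.bh = 0;
    r.h.bl = 0x0F;          /* paint the inverted pixels white */
    r.x.cx = 1;
    int86(0x10, &r, &r);

    setvect(0x43, old43);   /* put the real font table back */
}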
So, why did Sierra do this? It appears to be related to the methods they use to force black-on-white on the graphics screen. I can't figure out why they felt the need to limit color choices though.
Is it possible to get any other color combinations on the graphics screen?
Yes. AGI is notorious for failing to validate the values of arguments used in various commands. Due to a bug in how colors are handled, you can actually get AGI to display black text on a non-white background by passing an FG value greater than 127 to the set.text.attribute(FG, BG) command (BG must be zero). When you do this, the upper four bits of FG overflow into BG when the character is drawn, resulting in black text on a colored background. Extended characters are still garbled in this scenario.
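In C terms, the overflow presumably looks something like this. This is just my model of the bug based on the observed behavior; compute_attr and the unmasked shift are assumptions, not disassembled AGI code:

/* If the draw code builds the EGA attribute as (BG << 4) | FG
   without masking FG to four bits, FG's high bits land in the
   background nibble. */
unsigned char compute_attr(unsigned char fg, unsigned char bg)
{
    return (unsigned char)((bg << 4) | fg);
}

/* Example: set.text.attribute(144, 0) gives FG = 90h, BG = 0.
   compute_attr(144, 0) == 90h -> background nibble 9, foreground
   nibble 0 (black): black text on a colored background. */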
What about character byte values less than 32?
It is not well known, but if you include character values less than 32 in AGI messages, AGI will display the original MSDOS glyphs for those characters. AGI itself actually does this to get the arrow to show up in the save/restore game windows when you are selecting a slot. The only exceptions are characters 8, 9, 10, and 13; AGI moves the cursor as appropriate for these control characters.