Sunday, February 19, 2017

The Atari Game: GLIB by Mike Montana

This essay is printed here with the kind permission of the author, Mike Montana. Mike is the son of Rich Montana, who was the programmer of Glib for the Atari 2600. You can hear my interview with Mike in episode 134. I am indebted to Mike for all of his help with this episode. If you have any questions for Mike, please send them to me at 2600gamebygame@gmail.com, and I will make sure he gets them. Thank you for reading!


*********************************************************************************

Concept Genesis

Selchow & Righter are known for pretty much one thing – the classic board game “Scrabble,” where points are made by forming words drawn from random letter-tiles, each letter having a particular point value.

In the early '80s, execs at the company wanted to expand the brand into the marketing explosion that was "all things Atari." Friends of friends were contacted, and ultimately Pete Farentinos of Qualtronic Devices in Smithtown, Long Island was brought in. His company was selected as they were the regional distributor of Rockwell's 6502 processor – which, as you know, is the CPU family in the Atari 2600. Qualtronics and Selchow & Righter's marketing people put together a game concept that was essentially Scrabble. Each player would be assigned 7 random letters, and together, in real time, they'd race to complete a word for the most points. Multiple players would run across a field of letters, grab one (thus denying the other player), and run the letter to the word-block (imagine Wheel of Fortune meets the Hunger Games). Variants on the idea were pushed around, and the game concept was christened "Head of the Class".

Qualtronics brought in Rich Montana of NJ to evaluate the feasibility of writing the game on the Atari. He had often worked with Qualtronics as their electronics engineer on many 4- and 8-bit CPU projects (Mattel's hand-held football game, which ran on a 4-bit CPU, was the Qualtronics/Montana team's previous project).

Getting Technical

At the time, for consumer-grade products with CPUs, there was no operating system. There was no BIOS. There was, often enough, no ROM with prebuilt functions for interfacing with the hardware. The Atari 2600 was as close to useless as such a product could be: no pre-packaged ROM set, no BIOS – meaning there was no published API on how to write software for it. Writing software for it was more about understanding how to interface with the sound and "video" chip, and which bytes of zero-page memory were used by the hardware (and how they could be used). Knowing how to write software for the Atari was a closely guarded secret, and the "SDK" – more a collection of electronic parts data-sheets than a list of function pointers – was not public domain. It was available only through Sunnyvale's very expensive licensing. Key to getting the "Head of the Class" cartridge off the ground was getting that documentation. A few calls, a few wink-winks, and a few weeks later, a Xerox-copied subset of somebody's design documentation was delivered in a nondescript three-ring binder to Rich so that he could begin. The cost of the Xerox copies and binder was a then-staggering $25,000.

Getting Started

After reading through the design documents and weighing the design constraints of 1982 technology, Rich looped back with the marketing people and burst the bubble between "expectations" and "reality." Cost was, and is, a driving factor in any product development. "High-tech consumer products" were even more expensive – the typical Atari cartridge was 4K of ROM. This was the sweet spot of cost, availability, and useful size. Selchow & Righter weren't looking to become a software house. They wanted a product to expand the brand and make some money. They were not interested in bank-switching ROM nonsense, and they were not interested in doubling the component cost by stepping up to the next available ROM size. All these constraints made perfectly good sense.

Except: with 4K of ROM, there would simply be no room for a dictionary. The reality was clear – there'd be no way for the game to judge whether a created word was legitimate. This was probably a fatal flaw already; it was decided that the "other player" would "accept" or "reject" the formed word.

Except: if the "other" player was the judge on word acceptance, there could only be one active player at a time. The Hunger Games would have to wait another 35 years.

Except: Atari's player-missile graphics had real hardware trouble getting more than 8 "high-res" images to display on the same horizontal raster line. The 2600 has a very limited number of display modes – "low-res" was something on the order of 20 blocks per scan line, and the video chip could do some basic modes such as "mirror right-to-left" and "mirror top-to-bottom" without tying up the CPU/memory to describe the layout (which is why nearly every "maze" game is geometrically mirrored and very blocky). You could use this very-blocky mode to display the score (as in the very early games with giant score digits). Or, if you were clever enough to be careful counting CPU cycles, you could use the video chip's player-missile graphics to build text/digits out of the sprites, and then switch out of that sprite once the scan lines were beyond the "zone" where you displayed text/digits. …So much for displaying a field of Scrabble tiles for players to race around and grab.
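The "mirror right-to-left" mode can be sketched in a few lines. This is a simplified conceptual model, not Atari register-level code: the hardware stores only one 20-block half-line and either repeats or mirrors it to fill the 40-block screen width.

```python
# Simplified model of the 2600's 20-block playfield (not register-accurate):
# the hardware stores one 20-block half-line and either repeats or mirrors
# it to fill all 40 blocks of the scan line.

def render_playfield(left_half, mirrored=True):
    """left_half: list of 20 ints (0/1). Returns the full 40-block line."""
    assert len(left_half) == 20
    right_half = list(reversed(left_half)) if mirrored else list(left_half)
    return left_half + right_half

# A wall on the left edge...
row = [1, 1, 1] + [0] * 17
line = render_playfield(row, mirrored=True)
# ...automatically appears on the right edge too, which is why so many
# early maze games are geometrically symmetric.
print(line[:3], line[-3:])
```

The point is that the symmetry comes for free – drawing anything *asymmetric*, like a field of distinct letter tiles, is what costs CPU and memory.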

Except: the amount of RAM in this system was worse than "limited." The CPU was a variant of the 6502 – the 6507 – and the whole system had only 128 bytes of RAM. BUT, you don't get all 128 bytes. About 16 are actually controlled by the video/sound chips. Another block is the CPU's stack. Each call to a subroutine extends the stack's memory use by 2 bytes, and every push instruction (PHA, PHP) extends it by a byte. As is typical in assembly programming, you would often push values onto the stack, jump to a subroutine, do some work, and – depending on your philosophy – pop the registers before returning, or after. And if you mixed your philosophies you'd easily blow the stack. Not a big deal, right? You'd get a runtime exception for stack corruption. Except this isn't C code. There are no "exceptions." You would destroy a return address on the stack, and on the return from the subroutine you'd jump to an undefined location, and who knows what happens. Usually, just a hard lockup with an annoying sound from the TV set.
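That failure mode can be modeled with a toy stack. This is a Python sketch of the mechanism, not real 6502 code: a subroutine call pushes a 2-byte return address, each push adds a byte, and a single missing pop means the return instruction pulls the wrong bytes.

```python
# Toy model of the 6502 call/return stack (illustrative, not cycle-accurate).
# JSR pushes a 2-byte return address; PHA pushes one byte; RTS pops 2 bytes
# and jumps there. One unbalanced push and RTS "returns" somewhere random.

class ToyStack:
    def __init__(self):
        self.mem = []

    def jsr(self, return_addr):
        # push return address as two bytes (high, then low)
        self.mem.append((return_addr >> 8) & 0xFF)
        self.mem.append(return_addr & 0xFF)

    def pha(self, value):
        self.mem.append(value & 0xFF)

    def pla(self):
        return self.mem.pop()

    def rts(self):
        lo = self.mem.pop()
        hi = self.mem.pop()
        return (hi << 8) | lo

# Balanced philosophy: push inside the subroutine, pop before returning.
s = ToyStack()
s.jsr(0xF123)              # caller expects control back at $F123
s.pha(0x42)                # save a register inside the subroutine
s.pla()                    # restore it before returning
assert s.rts() == 0xF123   # returns to the right place

# Mixed philosophy: forget the PLA, and RTS builds its address from garbage.
s = ToyStack()
s.jsr(0xF123)
s.pha(0x42)                # pushed... and never popped
bad = s.rts()              # "return" address now includes the stray 0x42
print(hex(bad))            # not 0xF123 - off into the weeds, hard lockup
```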

At this point the game probably should have been dropped. But, deals were made, documents were procured, campaigns were being formed.

Development Begins

Qualtronics, being a regional distributor for Rockwell International's 6502, was also a distributor for their computer – the AIM-65. It was a 4K device, about the size of an Apple II, with a full-stroke keyboard – but only a single-line 20-character scrolling LED display and a 20-column thermal printer. The unit wasn't meant to be a "desktop" computer; it was more of a design base for industrial control applications requiring a computer. As such, it was meant to be expanded in whatever capacity the application required. The full CPU bus was available as an expansion port. Rich started off with one of these AIM-65s as the development environment for the "Head of the Class" game.

However, it needed upgrades. First off, a third-party dual-floppy-drive system was purchased. To use the floppy drives, the case was opened and two "expansion" ROMs were inserted into the C000–D000 sockets. An 80-column Zenith VT-100 terminal became the 9600-baud TTY "development interface," as the AIM had no ability to drive a video monitor directly. A box of floppies was mail-ordered and would arrive sometime within the month, as there was no way to buy floppies at retail in 1982. The Rockwell-licensed assembler was part of the package.

All that was needed was memory – every developer always needs more memory. There was no "memory expansion kit"; it had to be made. Development time was burned up handcrafting a 48K memory expansion. Data sheets were collected, schematics drawn up, a dozen or so RAM chips purchased, and rather than wasting time having a one-off PCB made, all the interconnections – two dozen wires per chip – were wire-wrapped by hand.

An ultraviolet lamp was purchased, along with a sleeve of 2732 EPROMs. A development target was selected – the family Atari 2600 was now commandeered, never to return from Development Hell.

The Development Cycle

Typically, embedded-systems work is done with an "in-circuit emulator" – a surgical replacement of the CPU with a connector that replicates all the CPU signals and branches them out to an external controller, so that code can be injected, debugged, and the system state inspected. However, no such in-circuit emulator was available for the Atari, and if one had been, it would have been only for Sunnyvale-blessed developers. Instead: why not just create a cartridge, pop it in, test it, and repeat until done?

Using an oscilloscope and data-sheets for the chips identified on the Atari motherboard, Rich found that all the data lines and address lines of the CPU were brought out to the Atari cartridge port, and that in the end the cartridge was merely a 4K ROM wired up in a standard address-bus/data-bus configuration. Meaning: just popping in another ROM was simple. The least-liked family game ("Combat") was chosen as the sacrifice. Its ROM chip was unsoldered and a ROM socket put in its place – this gave the ability to pop out the chip and pop in another without any soldering.

Code was written and assembled by ping-ponging source/object/binary files between the two floppy drives, with the resulting binary put onto a ROM at the AIM-65 side. The ROM was popped into the surgically mutilated Combat cartridge, the Atari powered up, and the code was there to test by playing the game itself. If the game crashed, or hung, or didn't go as planned, the only debugging tool was an oscilloscope to probe the data lines. Code would be changed, a new ROM generated, and a new cycle would begin.

EPROMs are the great-grandfather of flash memory. They are the grandfather of EEPROMs. Flash memory is permanent memory that can be changed under CPU control (like the SD cards used in phones – they don't require power to retain data). EEPROMs – "Electrically Erasable Programmable Read-Only Memory" – are similar in concept to flash memory, but erasing them was a long, tedious process. EPROMs, the most ancient, were "Erasable Programmable Read-Only Memory" – and the only way to "erase" the data was to put the chips under intense ultraviolet light for an hour. The design of that generation of read-only memory was such that EEPROMs, EPROMs, and ROMs were physically interchangeable. You could pop a 2732 EPROM directly into the socket of a 4K ROM chip; thus an erasable PROM could be used directly in place of the original Combat ROM.

Programming the EPROM was nearly as slow as erasing one. To program the EPROM, a "burner" board was required, configured via a 9600-baud serial port. It would take 16 bytes at a time, burn them sequentially into the EPROM, and within a few seconds be ready for the next 16 bytes. Writing 4096 bytes would take several minutes. Development cycles were lengthy bouts of patience, as there was no internet to be amused by while the chip was being burned.
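The arithmetic behind those waits is easy to sketch. The per-chunk delay below is an assumption – the essay says only "a few seconds" per 16-byte chunk and "several minutes" per full burn:

```python
# Back-of-the-envelope EPROM burn time. SECONDS_PER_CHUNK is an assumed
# figure; the essay only says "a few seconds" per 16-byte chunk.
ROM_SIZE = 4096          # bytes in a 4K cartridge image
CHUNK = 16               # bytes accepted by the burner per transfer
SECONDS_PER_CHUNK = 2    # assumed: serial transfer + burn + settle

chunks = ROM_SIZE // CHUNK
total_seconds = chunks * SECONDS_PER_CHUNK
print(f"{chunks} chunks, ~{total_seconds / 60:.0f} minutes per burn")
# 256 chunks at ~2 s each is roughly 8-9 minutes, consistent with
# "writing 4096 bytes would take several minutes".
```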

Once it was realized that the ROM was a standard off-the-shelf 4K component directly accessible by the CPU, and all the hardware was already in place, it didn't take much effort to wire up a cartridge receiver that could simply dump the contents of memory to disk – and copying friends' Atari cartridges to disk was painless. Burning them back to EPROMs was painless. Having a library of dozens of Atari games on a floppy was a secret kind of joy known best by developers to this day.
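The dumper is conceptually simple: walk all 4096 addresses, latch each byte off the data bus, write the result to disk. A Python sketch of the idea – the bus here is simulated with a list, where the real rig drove the cartridge port's address lines directly:

```python
# Conceptual sketch of the cartridge dumper: present each address on the
# bus, read the byte the ROM drives back, collect all 4096 bytes.
# The "bus" is simulated here; a plain list stands in for the ROM.

ROM_SIZE = 4096

def read_bus(rom, addr):
    """Stand-in for strobing the address lines and sampling the data lines."""
    return rom[addr & (ROM_SIZE - 1)]

def dump_cartridge(rom):
    return bytes(read_bus(rom, a) for a in range(ROM_SIZE))

# A fake 4K ROM image for demonstration:
fake_rom = [(a * 7) & 0xFF for a in range(ROM_SIZE)]
image = dump_cartridge(fake_rom)
print(len(image), image[:4])
```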

Game development was slow, but picked up as infrastructure code was developed around trying to understand the 2600 Design Document. The CPU – a 6502 variant, the 6507 – ran at about 1.19 MHz, which was just barely enough to get things done. The electronic design of the 2600 was such that the CPU had to synchronize with the video circuitry – effectively being told "it is now time to draw the screen" – and then set up the display-mode chip, set up the player-missile graphics, and "draw" the screen. Each assembly instruction takes a fixed amount of time – some instructions take 2 cycles, some up to 7. Each scan line of an NTSC TV takes about 64 microseconds, or 76 CPU cycles, so on average an assembly instruction costs 3 to 4 microseconds. You don't have a lot of time to do much once the scan line has started. Manually counting instruction cycles is required if you want to switch things up on the display as it's being drawn. Once the display is drawn, the CPU code returns to 'whatever it was doing before.'

In the case of Atari games, this “vertical blank time” is when scores are updated, enemies are moved, etc.
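The per-line cycle budget works out to very few instructions. A quick check of the arithmetic, using the standard NTSC figures (~1.19 MHz CPU clock, 76 cycles per scan line):

```python
# NTSC Atari 2600 timing: the CPU clock is ~1.19 MHz and each scan line
# lasts 76 CPU cycles. 6502 instructions take 2-7 cycles each.
CPU_HZ = 1_193_182
CYCLES_PER_SCANLINE = 76
AVG_CYCLES_PER_INSTRUCTION = 4        # rough midpoint of 2-7

scanline_us = CYCLES_PER_SCANLINE / CPU_HZ * 1e6
instructions_per_line = CYCLES_PER_SCANLINE // AVG_CYCLES_PER_INSTRUCTION
print(f"{scanline_us:.1f} us per scan line, "
      f"~{instructions_per_line} instructions to work with")
# Roughly 19 average-length instructions per line - which is why counting
# cycles by hand was mandatory when changing the display mid-frame.
```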

Finished Game Released to Manufacturing

Atari games are ROMs. An EPROM at the time cost probably two or three dollars. A ROM, by contrast, could be as cheap as a few cents each – if you had hundreds of thousands made. Making a ROM was a special process involving microscopic photography. When code was considered "finished" and ready for a ROM, the raw binary content was given to a ROM manufacturer. The manufacturer would make a series of photo-masks that represented, in literal physical terms, the 1s and 0s of the raw binary content. The raw "ROM blank" was a slab of photo-reactive silicon alloys that, through a repeated series of photo-exposures and acid washes, would physically burn the data onto the chip. Once created, it could not be changed. Once the photo-mask was created for a "run," it could not be changed. One incorrect bit, and the entire process would have to be repeated from the start. The ROM manufacturers weren't interested in 100 units. Not even 5,000 units was mutually worth the money. 10,000? Now you're talking. 100,000? That's where the per-unit costs get really cheap. It is best to buy in 100,000-unit quantities.
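The economics are a fixed one-time mask charge amortized over the run. All dollar figures below are illustrative assumptions, not numbers from the essay:

```python
# Hypothetical mask-ROM cost model (both dollar figures are illustrative
# assumptions): a fixed one-time mask/setup charge gets amortized across
# the production run, plus a small marginal cost per chip.
MASK_SETUP = 20_000.0    # assumed one-time photo-mask + setup cost
UNIT_COST = 0.05         # assumed marginal cost per chip at volume

def per_unit_cost(quantity):
    return MASK_SETUP / quantity + UNIT_COST

for qty in (100, 5_000, 10_000, 100_000):
    print(f"{qty:>7} units: ${per_unit_cost(qty):.2f} each")
# At 100 units each chip carries $200 of mask cost; at 100,000 units it's
# pennies - which is why the manufacturers only wanted big runs.
```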

There was a long lead time for getting the finished binaries to the ROM manufacturers, the plastic cases ordered, the packaging created, and the marketing efforts ramped up. Much effort was put into play for the anticipated holiday rush. This was going to be Selchow & Righter's splash back into Modern Family Life. Even if the game wasn't as exciting as originally planned. Even if the game was actually not very much fun. It didn't matter – the public wanted, and loved, all things Atari.

“Head of the Class” was developed and tested on the family 2600, and played on neighbors' 2600s to gather feedback and verify that it worked. The marketing folks had long since dropped the “Head of the Class” name in favor of “Glib.” Delivered on time, the binaries made their way to the factories; cases were sourced; stickers, labels, and manuals were printed. It all came together and arrived on store shelves as planned. Tens of thousands of them, all over the US, in Sears, Kmart, and Crazy Eddie.

Then the problems came in. It seemed that too many people simply couldn’t get the game to work – no matter how forcefully they blew into the cartridge. Kids across America were being scolded, “Did you put that tape near a magnet Johnny?! How many times did I tell you to be careful with them tapes!” No, it wasn’t a tape issue. No, it wasn’t a dust issue. It was a timing issue.

As it turned out, the Xeroxed subset of design documents was intended for European televisions. Recall how the CPU had to carefully count instruction cycles once the display started – being off by even a few instructions would throw things off. The requirement that the CPU be tightly bound to the video chip meant you had to be really careful about what you did, when, and for how long. Too many people (10%? 20%? Something like that) were finding the display to be unstable, or rolling, or just "freaking out" on their television sets. Technically, the timing was right on the edge of what would work on US televisions. Those televisions with the ability to handle overscan (which was new at the time) were the unexpected problem.
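European-derived timing is a plausible culprit because the two TV standards genuinely disagree on frame geometry. Comparing the published figures:

```python
# Why PAL-derived timing misbehaves on US sets: the two standards disagree
# on lines per frame and frame rate, so a display kernel that counts scan
# lines for one standard draws the wrong-shaped frame on the other.
STANDARDS = {
    "NTSC": {"lines": 262, "frames_per_sec": 60},
    "PAL":  {"lines": 312, "frames_per_sec": 50},
}

for name, s in STANDARDS.items():
    line_us = 1e6 / (s["lines"] * s["frames_per_sec"])
    print(f"{name}: {s['lines']} lines/frame at {s['frames_per_sec']} Hz "
          f"-> {line_us:.1f} us per line")
# A frame built for 312 lines overruns a 262-line NTSC frame by 50 lines -
# the picture rolls or tears, exactly the symptom customers reported.
```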

There was no way to deliver a "patch." There was no way to "reprogram" the games. The only way to fix the situation was to create an entirely new ROM, which meant another 100,000 ROMs. The customers wanted their money back. And the game wasn't much fun anyway.

Realizing there wasn't a financially viable avenue out of this, Selchow & Righter accepted returns. Only the returns weren't from the 10 or 20% of affected customers – they were the entire inventory from Sears, Kmart, and Crazy Eddie. All the useless inventory was collected in a warehouse, written off as a loss, and never heard from again.

Timing is everything.

--

The Author: I'm Mike Montana, son of Rich Montana the developer. I was 14 at the time, and my father took me on as his apprentice and Chief Coffee Fetcher. He dumped the grunt-work on me: generating the hex encodings of the player-missile graphics, writing some small code (BCD math in assembly), and pretty much all the drudgery he did not want to do himself – for which I thought I was the Coolest Kid in the World. From this experience I learned programming, soldering, debugging, and how to copy Atari cartridges. It put me on the path to being the developer I am today, some 35 years later.
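The "BCD math in assembly" mentioned above refers to the 6502's decimal mode, where each byte holds two decimal digits – handy for score displays, since no binary-to-decimal conversion is needed. A Python sketch of what a decimal-mode add does:

```python
# Sketch of 6502 decimal-mode (BCD) addition: each byte stores two decimal
# digits (0x42 means "42"), so scores can be shown on screen without a
# costly binary-to-decimal conversion routine.

def bcd_add(a, b):
    """Add two packed-BCD bytes, as the 6502's ADC does with the decimal
    flag set (SED). Returns (result_byte, carry)."""
    lo = (a & 0x0F) + (b & 0x0F)
    carry_lo = 0
    if lo > 9:
        lo = (lo + 6) & 0x0F   # decimal-adjust the low nibble
        carry_lo = 1
    hi = (a >> 4) + (b >> 4) + carry_lo
    carry = 0
    if hi > 9:
        hi = (hi + 6) & 0x0F   # decimal-adjust the high nibble
        carry = 1
    return (hi << 4) | lo, carry

score, carry = bcd_add(0x15, 0x27)   # "15" + "27"
print(hex(score), carry)             # 0x42, no carry - i.e. "42"
```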



2 comments:

  1. "The least liked family game (“Combat”) was chosen as the sacrifice. "
    Combat is a 2K game and the pcb can't be used for 4K games b/c it's missing an edge connector.

    1. Good point Scott, thanks for asking. I asked Mike about it, he said this: "Eagle eyed fan base you have! Impressive. Yes the Combat was a 2kb and the PCB simply would not support a 4kb ROM. I dont recall when the development outgrew a 2kb EPROM but the intent was to deliver a 2kb game - it would have been cheaper to mfg. When ever it outgrew the limit, some other game was used - I dont recall when/which-one. But Combat was never seen again."
