The History and Direction of the Computer - A Personal View, Part 3

Jim Kuzma returns to continue his look back through the years of computing

By Jim Kuzma, Contributing Writer, KOOZ@CompuServe.COM

(...continued)

I can still remember the day it arrived. Two heavy boxes with the familiar big blue striated IBM logos on the sides. The XT, Intel, and Microsoft had penetrated my life to make it forever more miserable than I could have imagined. The first ten minutes were fun.

Just imagine, 10 whole megabytes of screaming fast hard disk space, more than could possibly be used. A RAM address space of up to 640K, ten times the room of the CP/M machine. A buckling-spring keyboard with function keys. A monitor plugged directly in, not a terminal through a slow serial port. APL ran on it. Wow.

The installation manual was two inches thick, just to tell us how to plug in the keyboard and monitor. It treated us like morons. Welcome to IBM lingo, where boards were "adapters" and every other page had the same picture of a mindless geek gazing at God's gift to humanity, the personal computer.

We held our breath when we switched the power to the "1" position. What were these slow, sequencing green numbers, and when would they end? It took a whole screen to display "640 KB OK". Spelling out "kilobytes" would have been too much information; it might have let the uninitiated know that RAM was being checked. Like it mattered. Like we could stop it. Little did we know that that 70-second boot procedure would cumulatively waste an entire season of our lives, added up over every time we had to reboot, which was all too often.

DIR left a blur of afterimages on phosphors with a 20-second visual persistence, probably to give us the illusion of speed. The keyboard could be heard two blocks away, and the delete key was loudest. But hey, it was an IBM. Built like a tank, and oh, that legendary customer support.

The manuals seemed to be filled with information, and they were. Too bad most of it was erroneous or superfluous. DOS had an interesting tree structure for directories, and the rest of it felt like CP/M, right at home. Except that most of the standard utilities had only half the features of their CP/M equivalents: compare DDT with DEBUG, or ED with EDLIN. Where was the symbolic debugger, or the macro commands in the editor?

The nightmares began when we started programming it. Segment registers were 16 bits, multiplied by 16 and added to a 16-bit offset to generate the physical memory address, which meant that two tests instead of one were needed to check whether a pointer was out of bounds. Worse than that, more than one segment:offset combination could point to the same location. Most languages, including the data area of the provided BASIC, were still limited to 64K, making us wonder why any sane person would design this P.O.S. processor (Piece O' Sh..). Little did we know just how many bugs, how many limitations, how many curses would inevitably result from this garbage architecture. That was Intel's fault. They could have learned a little from Motorola. The 68000 was a contemporary processor with heaven for an instruction set, not crap like "REPE MOVSW", a real brain-numbing mnemonic.
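For anyone who missed the pleasure, here is a minimal C sketch of that address arithmetic; the segment and offset values are arbitrary examples, not anything from a real program:

    #include <stdio.h>

    /* 8086 real mode: the 16-bit segment is shifted left 4 bits
     * (multiplied by 16) and added to a 16-bit offset, producing
     * a 20-bit physical address. */
    unsigned long physical(unsigned int segment, unsigned int offset)
    {
        return ((unsigned long)segment << 4) + (unsigned long)offset;
    }

    int main(void)
    {
        /* Two different segment:offset pairs, one physical address. */
        printf("1234:0010 -> %05lX\n", physical(0x1234, 0x0010)); /* 12350 */
        printf("1235:0000 -> %05lX\n", physical(0x1235, 0x0000)); /* 12350 */
        return 0;
    }

Since thousands of different pairs alias the same byte, you couldn't even compare two pointers without normalizing them first, which is exactly why every bounds check needed two tests instead of one.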

It wasn't all Intel's fault, though. Between IBM's lack of documentation and support and Microsoft's penchant for releasing sloppy software, this machine was poised to change the world for the worse. When Big Blue made a move, the whole world paid attention, like someone walking directly behind an elephant, wary of the position of the tail. If you wanted to market a design for it and called up IBM to ask how, some cryptic bureaucrat would ask if you were a VAR. What's that, a variable? No, that's IBMspeak for value-added reseller. They weren't interested in development. That was evident when they kissed their PC goodbye once the world showed them how it was done. In a colossal faux pas, they tried to show us that they knew best with a whole new hardware package around the same stupid machine, the PS/2 (yeah, Piece...).

But the PC's ran Lotus and WordPerfect and dBASE, so the business world was scarfing them up like popcorn. By sheer volume alone, the machine's popularity was sealed and delivered. For a few grand, you got a workstation immune to the crashes and bottlenecks of the office mainframe. At least you could get some work done. A person became a guru just by successfully installing a hard drive or a mouse or a RAM upgrade, or by resurrecting a machine by pressing the RAM chips back into their sockets or low-level formatting a hard drive.

Anyway, we used the XT's and AT's as soon as they came out for printed-circuit CAD and real-time audio signal processing in speech recognition. We were the first to show off 16-bit, 50-kilohertz stereo direct-to-disk continuous audio sampling and editing on an XT at the 79th Audio Engineering Society Convention at the Hilton in New York City in 1985.

I got involved in warehouse robots, known in the industry as AGV's, or Automated Guided Vehicles. These were golf-cart-size monsters capable of lifting a few tons and wheeling the load around the warehouse under their own control. They had XT's for brains, and I felt at home on the machine. It was my first taste of multitasking. We wrote everything from the BIOS chip to the kernel to dozens of simultaneously running tasks. Safety, diagnostics, traffic control, steering, you name it. I'll never forget the day one of the software engineers wondered what the safety task did, removed it, and watched the vehicle veer off the path and plow through the wall into the software department. The AGV poked its nose inside the office, blinking its headlights at the development engineers. The safety task was promptly reinstalled.
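To give the flavor of it, here is a hypothetical sketch, not the actual AGV code, of the sort of cooperative round-robin kernel we ran: each task does a short slice of work and returns, and the kernel just cycles through the table forever.

    #include <stddef.h>

    /* A hypothetical cooperative round-robin kernel, in the spirit of
     * the AGV software; the task names are illustrative, not the real
     * thing.  Each task runs a short slice of work and returns. */
    typedef void (*task_fn)(void);

    static void safety(void)      { /* watch the guide path, e-stop on a fault */ }
    static void diagnostics(void) { /* check sensor and motor health */ }
    static void traffic(void)     { /* negotiate intersections with other AGV's */ }
    static void steering(void)    { /* servo the wheels onto the path */ }

    static task_fn tasks[] = { safety, diagnostics, traffic, steering };

    int main(void)
    {
        size_t i;
        for (;;)  /* the kernel: round and round, forever */
            for (i = 0; i < sizeof tasks / sizeof tasks[0]; i++)
                tasks[i]();  /* run one short slice of this task */
    }

Take safety out of that table and the machine still steers; it just stops caring whether it should, which is more or less what happened to our wall.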

Computer Aided Design has evolved nicely, making engineering less torturous and tedious, and allowing more time to be devoted to the creative process instead of the drudgery. That is what computers were supposed to have done.

It seemed that as time went by, kludges like expanded and extended memory were actually slowing things down, making code three times more complicated to write and programs larger and larger; yet the response was not to find a better structure but to maintain compatibility, hauling around anachronisms like the program segment prefix and the disk partition limitations. The only way to better productivity was to buy a faster machine with more memory. So now, to write a text document with any hope of doing it as fast as on an XT with a simple full-screen editor, you need a Pentium processor and 8 Meg of RAM. This is progress?

Arguably, the GUI revolution of Windows on the PC was a belated and poor attempt at copying what the Mac and the Amiga had done years earlier, and it achieved far less. The evolution towards fat, inefficient operating systems has clearly made things worse. Software has bloated so much that no one individual can truly know what is happening in his own machine anymore, let alone find out why something isn't working, except by stabbing around, trying this and trying that, and settling for the results.

I saw this junk coming at the beginning of the decade and refused to get caught up in it. I moved instead towards the micro revolution in embedded controllers. There, you still had the whole machine under control and could fathom its operation. There, compactness of code and elegance of design were still appreciated qualities in software, because resources were limited. There, it was still an eight-bit world, and assembler was the language of choice and of necessity. There, Microsoft had no influence.

(to be continued...)