Classic personal computers, and the advantages and disadvantages of becoming a hacker/guru of them

If you want to truly master a computer, that computer can't be too big. At some point, every user of modern computers must come to the basic realization that no single human being can actually understand a modern-day PC. There's just too much to them. Even if the BIOS code and operating system code for the machine were open-source (which is typically not the case), there would just be too much data there for a person to read in a single lifetime. If your operating system is 200 megabytes in size, and supposing that you live to be 80 years old (29,200 days, not counting the extra day in leap years), you'd need to learn about 6,849 bytes of instructions every single day of your life to learn the whole thing. Even if you pulled this off (which would give you a little over 12 and a half seconds to understand each byte, assuming you didn't waste time eating or sleeping), there would be no time left in your life to actually use that knowledge. Plus, many operating systems are actually way past 200 megabytes now; some are into the gigabyte range. And that's just the operating system, never mind the circuitry that makes up the computer's hardware. You reach a certain point where "daunting task" turns into "task beyond the limits of a human being".
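
The arithmetic in that thought experiment is easy to check for yourself. Here's a quick sketch (using decimal megabytes, as the figures above do):

```python
# The "learn the whole OS in a lifetime" arithmetic from above,
# using decimal megabytes (200 MB = 200,000,000 bytes).
os_size_bytes = 200_000_000
lifetime_days = 80 * 365                  # 29,200 days, ignoring leap days
bytes_per_day = os_size_bytes / lifetime_days
seconds_per_byte = 24 * 60 * 60 / bytes_per_day

print(round(bytes_per_day))               # 6849 bytes to learn each day
print(round(seconds_per_byte, 1))         # about 12.6 seconds per byte
```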

Many people who seek to learn about computers become faced with this exact problem. (More specifically, anybody who seeks to *truly* understand computers does; the people who don't are the ones who are content to believe that being able to install a modem or change a configuration file means they know all about computers. Not to disrespect such knowledge; those are useful skills, but being able to do those things does not mean that you actually understand the fundamental ways in which computers work.) Many people deal with this problem by turning to the classics: The computers of bygone decades, machines that were actually small and simple enough that a single human mind could literally know and understand everything about the hardware and software of the machine. A computer with only 64 kilobytes of RAM might seem pathetically tiny to a typical PC user today, but the reality is that if you actually were to print out the entire 64K memory space of that computer and read through it, it would take you a long time to do so. To actually know what every single one of those 65,536 bytes does would take months, if not years. A full printout and commentary of all that code would fill a phone book. 64K doesn't seem so small now, does it?

Although these classic computers provide one approach to a computer that a person can understand, the advances of modern technology have also created a second, perhaps even more exciting possibility: The prospect of creating one's own computer, from scratch. Today, "reconfigurable computing" is a significant buzzword, and many people have literally designed a CPU and all the other supporting circuitry and code needed to make a real computer. FPGAs and other forms of programmable logic usually figure prominently in such efforts. The result is a computer whose architecture you can truly understand and control, since you built it yourself. However, this web page is not about that; there are already other pages (including some on this very site) about that. Instead, this page is about the first alternative: Going back to already-made microcomputers that have already built a significant base of software and culture around themselves.

If you're going to take the time and trouble to learn all about somebody else's computer design, you probably want it to be a pretty *good* computer design. As such, this page endeavors to list the major pros and cons of the most significant microcomputers in history, such that you, the aspiring hacker guru, may discern what system best fits your style. Although I usually try to avoid bowing to popularity, on this page the relative popularity of each system (usually considered from a North American perspective) factors into the picture, simply because if you're going to go all-out and learn about these systems, you might as well also form connections with people who share your interest of that computer. Although some people believe that computers are isolating and make for lonely people, early microcomputers actually caused people to group together for the purpose of sharing software and user tricks for their microcomputer of choice. You can certainly hack on your own without sharing the experience with anyone, but like many other hobbies, you may eventually desire to share yours.

Apple II


Open and accessible.

The Apple II was designed by Steve Wozniak, a hacker in the truest sense. Not surprisingly, therefore, the system has an architecture that is famously accommodating to other hackers. Hardware schematics for the entire machine are readily available, and the machine's ROM BASIC is short on the cruft that designers often pack into their ROMs, but contains several useful subroutines that make machine-language programming easier. The architecture of the Apple II ensured that a significant coder community grew up around it in spite of its audiovisual shortcomings.


Graphically and acoustically, a real turkey.

In terms of both graphics and sound, the Apple II computers are by far the worst of the major microcomputers of the 1980s. (The only exception is the Apple IIgs, which is discussed separately on this page.) The Apple IIs are capable of proper RGB video output, and if you plug them into a television using the RF video interface, you can actually get a pretty decent screen, but their default monitors were typically monochrome, and the color monitors were even worse, because they were usually composite color monitors which produced horribly ugly, broken colors.

The Apple II computers' sound was also minimal. Their only actual sound function was to make the speaker click; programmers could (and did) "click" the speaker several times in rapid succession to produce tones and sounds, but doing so was a rather non-intuitive art that left something to be desired.
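
To make that "clicking" technique concrete: every access to the Apple II's speaker soft switch at address $C030 flips the speaker cone, so a steady tone is just two toggles per waveform period, spaced by a timed delay loop. Here's a small sketch of the timing arithmetic, assuming the commonly quoted 1.023 MHz 6502 clock:

```python
CPU_HZ = 1_023_000   # commonly quoted Apple II 6502 clock rate (approximate)

def cycles_between_toggles(freq_hz):
    """CPU cycles to burn between accesses to the $C030 speaker soft switch.

    Each access flips the speaker, and a square wave needs two flips per
    period, so each half-period lasts 1 / (2 * freq) seconds.
    """
    return round(CPU_HZ / (2 * freq_hz))

print(cycles_between_toggles(500))   # 1023 cycles per half-period for 500 Hz
```

Turning a cycle count like that into an actual 6502 delay loop, while keeping the loop overhead itself accounted for, is exactly the non-intuitive art the paragraph above describes.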


The Apple II is for the true hacker who likes to code and doesn't care how pretty their machine is. You can do some neat stuff with the Apple II, but passers-by won't know it.

Apple IIgs


Brilliant, sophisticated graphics and sounds.

The Apple IIgs was a giant leap over the other Apple II machines in both graphics and sound. This was so important a step for the Apple II family that the "gs" in "IIgs" literally stands for "graphics" and "sound". Graphically, the IIgs came with a proper RGB monitor that produced proper 256-color graphics without the ugly color clashing of composite graphics. Acoustically, the IIgs came with a sampling synthesizer chip that remains legendary within the computer industry to this day.

Backwards-compatible with prior Apple II stuff.

In a burst of sheer engineering ingenuity, Apple made an ASIC called the Mega II, which packed virtually the entire functionality of an Apple IIe onto one chip, and incorporated the Mega II into the motherboard for the IIgs. The result is a historic and rather unique example of hardware-level system-within-a-system emulation, designed to ensure that programs for earlier Apple II machines could run natively on the IIgs. For the most part, it works; the Apple IIgs really is backwards-compatible with most of the software for its other Apple II siblings.


Minimal user base.

The Apple IIgs was nothing less than a revolution for Apple. Unfortunately, Apple chose to undermine and abort this revolution, for one simple but very bad reason: They wanted to hype the Macintosh instead. Apple believed that the mouse-based GUI was the future (sadly, in retrospect, they were right), and although the Apple IIgs had a separate GUI you could run on it, Apple preferred to market a cute-and-friendly machine like the Macintosh that was designed around a GUI from the ground up. The result of all this is that although the Apple IIgs was well received by most of the people who bought it, its life in the spotlight was short-lived, and it did not have time to develop either a significant software base or a user fan base. Only a handful of really good games that actually took advantage of the IIgs' capabilities ever came out; similarly, a few people learned the deep secrets of the IIgs (and those people typically swear by the IIgs even to this day), but they are few and far between, making for a rather thin Apple IIgs community.

Exceedingly complicated sound interface.

With great power sometimes comes great complexity. The sampling sound interface used in the IIgs wasn't simple to use. It took a significant amount of set-up before you could get it to make real sounds. The advantage is that if you want to make some really spectacular sounds (beyond the limits of what computers should have been capable of in the mid-1980s), you can with the IIgs. The downside is that it's not as easy as it should be to do so.


The Apple IIgs is the machine for the starving artist, the person who truly wants to create engaging, brilliant art and doesn't care whether it sells or not.

Commodore 64


Revolutionary sound system.

Unlike the Apple IIgs, which used a sampling sound synthesizer (meaning it required actual digitized sound samples), the Commodore 64 used the now-legendary 6581 SID (Sound Interface Device), a regular synthesizer chip. It had oscillators, a filter, and attack/decay/sustain/release controls, just like a professional synth. The result was an exceedingly easy-to-use sound system that could be quickly interfaced to, and that actually made passably decent sound. Admittedly, the sound produced by the C64's SID wasn't nearly on the level of quality that you could achieve with the Apple IIgs, but the Commodore 64 made its sound system a lot more accessible. Many people actually prefer the sound of the SID because of its nostalgic effect, although its bloopy-bleepy sound is probably an acquired taste.
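
For a sense of how accessible the SID really was: producing a pitch comes down to writing a 16-bit value into voice 1's frequency registers at $D400/$D401 (POKEs 54272 and 54273 from BASIC). Here's a sketch of the register math, using the 6581 datasheet relation Fout = Fn x Fclk / 2^24 and the PAL system clock of 985,248 Hz:

```python
SID_CLOCK_PAL = 985_248   # PAL Commodore 64 system clock in Hz

def sid_freq_value(freq_hz, clock=SID_CLOCK_PAL):
    """16-bit value for the SID voice 1 frequency registers ($D400/$D401).

    The 6581 datasheet gives Fout = Fn * clock / 2**24, so the register
    value Fn is the desired pitch scaled by 2**24 / clock.
    """
    return round(freq_hz * 2**24 / clock)

fn = sid_freq_value(440)        # concert A
lo, hi = fn & 0xFF, fn >> 8     # in BASIC: POKE 54272,lo : POKE 54273,hi
print(fn, lo, hi)               # 7493 69 29
```

A couple of pokes like that, plus a gate bit and a volume setting, and the chip sings; that's the accessibility the SID is loved for.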

Colorful graphics.

The Commodore 64's graphics system isn't as fondly remembered as its sound chip, but it had several different video modes which gave you some flexibility. The C64 was somewhat hampered by low-resolution graphics (going to higher resolutions required sacrificing the color for monochrome), but those slightly-chunky-but-colorful graphics could be made to look positively appealing in the right hands.


Lack of high-resolution graphics.

As just mentioned, the C64's color graphics modes were low-resolution, making the platform forever famous for blocky, chunky characters and landscapes.

Terribly slow data transfer.

The disk protocols used on the Commodore 64 are insanely, abominably slow. It takes a simply unreasonable amount of time to load anything on it. This isn't too bad if you can put up with setting a program to load and then going out to get lunch while you wait for the program to run, but in general, it's absolutely painful to wait for the machine to load any full-sized program.
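
Some rough numbers make the pain vivid. The stock 1541 drive is often quoted at around 300 bytes per second over the serial bus (a ballpark figure; real-world rates varied with the program and the loader), so loading a program that fills most of the machine's memory looked something like this:

```python
# Back-of-envelope 1541 load time, assuming the often-quoted ballpark
# rate of roughly 300 bytes per second over the stock serial protocol.
BYTES_PER_SEC = 300
program_size = 64 * 1024                 # a program filling most of RAM
seconds = program_size / BYTES_PER_SEC

print(round(seconds / 60, 1))            # about 3.6 minutes of waiting
```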


The Commodore 64 is the everyman's computer. Although it suffers from a few minor idiosyncrasies and shortcomings, it is by and large a solid, enjoyable computer with an enormous fan community.

Commodore Amiga


Brilliant audio/visual presentation.

Amiga users *STILL* love to boast about how the Amiga had everything before any other platform did. Although you might hate to admit it since it only encourages them, the truth is that they're right: The Amiga had 32-color graphics and stereo, multichannel sound before any other major platform. Its capabilities were way, way ahead of its time, and remain beautiful even today.

No need for a proprietary external power supply.

Unlike the Commodore 64, which uses a weird external power supply that actually supplies two different voltages (one AC, one DC!) through an unusual 7-pin DIN connector, the Amiga has its own internal power supply. The only external power component you need is a normal power cord, just like the ones that plug into PCs today.


No internal user interface.

Unlike the Commodore 64, which has a BASIC interpreter built into its ROM (through which you can, at least in theory, do anything with the computer through a series of PEEKs and POKEs), the Amiga was never capable of doing anything without disk-based software; it won't boot into anything usable from firmware alone (unless you count a still image of a hand holding a floppy disk which is meant to tell you to pop a disk into the drive). In this respect, it's much like the IBM PC. While this isn't a terribly big deal if you happen to have a boot disk on you, it does become a pain when you just have the computer and no software for it, and find that you can't use it. Note that the original Amiga 1000 actually needed *TWO* boot disks; the first was Kickstart, the initial bootup code that was included in a ROM chip on all later Amiga models. Once you got Kickstart loaded, all Amiga models require some software to do anything. This was classically Workbench, a GUI interface not unlike the Apple Finder or Windows and its ilk, but many programs were written such that they didn't need Workbench to run.

Not as popular as the Commodore 64.

The Amiga lacks as significant a user base as other major microcomputers. While it's not that hard to find devoted Amiga people, they tend to adore their Amigas with an enthusiasm that borders on the fanatical, which is not unusual for a small group that knows they're right but can't seem to get the world to agree with them.


The Amiga is the computer for those who demand the absolute elite; just as these people will spend 10 times as much money for a car that's only slightly better than the competition, so the Amiga user will stop at nothing to get this jewel of an underdog that consistently outperforms everything it's up against.

IBM PC

Widespread today, making hardware hacks broadly applicable to almost any computer.

It's not hard to find a PC. It's important to understand that the PC architecture defines a certain standard that PCs have always had to adhere to in order to ensure software compatibility; what this means is that many PC hardware and software hacks which were effective more than 20 years ago will still work perfectly on any common modern-day PC. This kind of accessibility, being able to practice your art almost anywhere you go, is a great advantage that you simply can't achieve by being, say, a Commodore 64 guru; even once you've mastered the C64, where are you going to make use of that skill once you go out into the world?


It's not very "special".

PCs today are so powerful that "demo scene" coding is almost irrelevant now; you can make impressive graphics and sound through a well-established regimen of standard procedures. That's still kind of cool, but it doesn't really make you a "hacker" if you do it; it just makes you another coder.

Wide variety of configurations means not all hacks are applicable.

The "standard" configuration for the PC has changed significantly over the years. For example, in the early 1990s it was likely that most PCs you ran into would have an AdLib sound card, or a Sound Blaster card (which was AdLib-compatible), so if you knew how to write AdLib routines in assembler, you could quickly start doing FM synthesis with almost any PC you ran into. Today, such tricks aren't as applicable, since the AdLib standard has been largely forgotten and little programming is done in real mode anymore.
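
As a taste of what those old AdLib routines actually did: the OPL2 expresses pitch as a 10-bit F-Number plus an octave "block", related to output frequency by Fout = F-Num x 49716 / 2^(20 - Block) in the commonly circulated YM3812 documentation. A sketch of that register math:

```python
OPL2_RATE = 49_716   # master rate constant from the YM3812 (OPL2) docs

def opl2_fnum(freq_hz, block):
    """F-Number for a given pitch and octave block on the OPL2.

    Output pitch is F-Num * 49716 / 2**(20 - block), so solve for F-Num.
    The result must fit the chip's 10-bit F-Number field (0-1023).
    """
    fnum = round(freq_hz * 2**(20 - block) / OPL2_RATE)
    if not 0 <= fnum <= 1023:
        raise ValueError("pitch out of range for this block")
    return fnum

print(opl2_fnum(440, 4))   # 580: concert A in block 4
```

On real hardware those values then went out through the card's I/O ports in real mode, which is precisely the kind of trick that modern protected-mode systems made obsolete.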


The IBM PC is the computer for the scientific programmer, who wants to write math-intensive software that can make good use of the faster CPUs of today's PC. The IBM PC is also the computer for the populist, the one who wishes to rally around the standing de facto standard as chosen by society.

MSX

Solid machine; good graphics, good sound, good CPU, good architecture.

The MSX was a computer of legendary popularity the world over, and for good reason: It was just a great little machine. It didn't have any one particular strength that stood out; it simply offered good graphics, good sound, and a good internal structure that made it fairly hacker-friendly while still being usable and accessible.


Virtually unknown in North America.

For some bizarre reason, the MSX was a household name in most of Europe, Asia, and even Africa and South America... Yet a majority of North Americans have never even heard of it. Travel anywhere else in the world and you'll readily find hordes of people who fondly remember this computer, but if you live in North America, you'll have a hard time finding people to join you in your MSX hobby.


The MSX is the computer for the globalist.

ZX Spectrum


Strong, dedicated fanbase in Europe.

The Spectrum sure gets a lot of love in Europe. It's certainly a neat little machine, with its best feature being its innermost guts: A well-designed ROM BASIC system.


Ugly color scheme.

People still talk about what a great machine the Spectrum was, and in many ways it was ahead of its time, but almost any person not acquainted with the Spectrum becomes instantly and rudely awakened to its greatest weakness when they see any program written for the machine: The ZX Spectrum has absolutely horrible color.

The problem here is the way the ZX Spectrum's screen colors are mapped out. The screen's color map is broken up into a 32x24 grid (i.e. the whole screen is turned into a grid of 32 columns and 24 rows); on each square of this grid, you can store two color values. This means that in any block on the screen, you cannot have more than two colors, which severely limits how many objects can realistically intersect in one area on the screen. Many programs worked around this problem by being entirely black-and-white so colors didn't factor into the equation at all. Others simply accepted the problem and went ahead with weird color discontinuities in which objects would spontaneously pass through jarring color changes as they moved around the screen. Most programs, however, simply dealt with this problem by keeping objects far apart so that they didn't intrude on each other's color space. This didn't bode well for games, in which objects often need to exist near each other.
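
The attribute scheme is simple enough to spell out in a few lines. Each cell of the 32x24 grid is backed by one byte in the 768-byte attribute file starting at address 0x5800, packing INK (foreground), PAPER (background), BRIGHT, and FLASH into eight bits:

```python
ATTR_BASE = 0x5800   # start of the Spectrum's 768-byte attribute file

def attr_address(col, row):
    """Address of the attribute byte for one 8x8 cell (col 0-31, row 0-23)."""
    return ATTR_BASE + row * 32 + col

def attr_byte(ink, paper, bright=False, flash=False):
    """Pack INK (0-7), PAPER (0-7), BRIGHT and FLASH into one attribute byte.

    One byte covers a whole 8x8 cell, so every pixel in that cell shares
    the same two colors -- which is exactly why attribute clash happens.
    """
    return (flash << 7) | (bright << 6) | (paper << 3) | ink

print(hex(attr_address(31, 23)))   # 0x5aff, the bottom-right cell
print(attr_byte(ink=2, paper=6))   # 50: red ink on yellow paper
```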

The ugly effect of incompatibly-colored graphics trying to coexist on the screen of a ZX Spectrum was and is famously known as "attribute clash". Although some people see it as an endearing nuance of the Spectrum's personality, others see it as so serious a problem as to disregard the Spectrum as a possible development platform.

Little user base in North America.

Most of the North Americans with fond memories of the Spectrum seem to be Europeans whose fond memories are from their home countries.


The ZX Spectrum is the computer for people who had one before, and want to have one again.

There is, of course, much more information on all of these platforms online. One of the most entertaining (if not the most informative) articles I've read on the endless platform debate is ZX Spectrum: Absolutely Better Than Commodore 64, an allegedly true story about a contest held, as you can guess, between a group of C64 loyalists and a group of devoted ZX Spectrum users. Right from the title, you can probably imagine that the article is incredibly biased, and indeed it is; the purportedly neutral judge who was chosen to preside over this contest also seems to have a strong bias toward the ZX Spectrum, as he awards it points for completely ridiculous things; at one point, he awards points to the Spectrum for having "real colors" because the Spectrum's shade of red is nicer, despite the Spectrum's crippling color-clash problem that makes most of its programs look like a bad LSD experience. Neither team really seems to have any idea about what really goes into a computer, either, as they claim platform advantages for negligible things, like the C64 having built-in playing card symbols (clubs, diamonds, hearts, and spades) so "we have them ready for use, so we save a lot of hard work". Neither team actually represents the real strengths of their chosen platforms effectively, so the article actually isn't that valuable from a technical perspective, but it seems to provide a lot of insight into how microcomputer enthusiasts thought about computers back in the 1980s.

Back to the main page