Foolish things people think about computers


As is often observed, people sometimes do silly things with computers. Well, that may be true, but the fact is that actions usually begin as thoughts. Rarely does the average person commit an act without some degree of forethought (even if the relative amount of that forethought is somewhat small).

The root problem is not so much what people do, but rather, how people think. If you wish to change behavior, you must begin by changing thought patterns. To that end, this page is devoted to observing some of the most foolish and dangerous ideologies I've seen surface in the computer industry, which collectively could lead to any number of problems for users, developers, and engineers.


The main reason computers so often fail to live up to expectations is a fundamental misunderstanding of what computers are supposed to do. People rarely seem to truly grasp the function behind a computer. This is bad enough when end users exhibit this shortfall of understanding, but when developers suffer from the same shortcomings, it negatively impacts the entire industry.

One of the largest debates that has furiously raged on in the computer arena is over what operating system people ought to use. While there are various reasons to use various operating systems, the main reason people choose their OS is simply software support; users want to run applications, and they pick the OS that is most likely to run their apps of choice.

Of course, it is well-understood by now that the best operating system is the one you program yourself, because then you can customize it to suit your own needs in any manner you wish. However, for various reasons, people do sometimes end up using ready-made operating systems that have been pre-programmed by other people. This allows people, in turn, to use ready-made software that has been pre-compiled for that operating system, which can be very convenient, since it saves the user from having to write their own application.

Unfortunately, in our current computing industry, people are often encouraged to use the Microsoft Windows operating system, an environment which is patently useless because it does not effectively utilize the key applications for the personal computer. People speak of being able to run word processors, spreadsheets, and database servers on their computers, which serves to demonstrate a fundamental failure to grasp the purpose behind a computer. As a quick reality check, count the number of times you have heard a colleague say "Hooray! I can run a spreadsheet application on my computer! I am ready to die now because my life has been fulfilled!" If, like most people, you come up with the number zero, perhaps this points to a need for a fundamental shift in the ideology applied to a computer's application set.

The truth is that most good computer programs on the Intel 80x86 platform are written for MS-DOS. As a quick reality check, I opted to make a list of the most important mission-critical business applications I use on a daily basis. The list looks something like this:

Aces Of The Pacific
Aces Over Europe
Chuck Yeager's Air Combat
Deus Ex
Doom
F-15 Strike Eagle II
F-15 Strike Eagle III
F-19 Stealth Fighter
Half-Life
King's Quest I
King's Quest II
King's Quest III
King's Quest IV
King's Quest V
King's Quest VI
Leisure Suit Larry I
Leisure Suit Larry II
Leisure Suit Larry III
Leisure Suit Larry V
Leisure Suit Larry VI
Lemmings
Lemmings 2: The Tribes
Maniac Mansion
Sam And Max Hit The Road
SimCity
SimCity 2000
Space Quest I
Space Quest II
Space Quest III
Space Quest IV
Space Quest V
Strike Commander
Stunts
Syndicate
System Shock
The Incredible Machine
The Even More Incredible Machine
The Incredible Machine 2
Vette!
Wing Commander
Wing Commander II
Wolfenstein 3D

Again, this is only a cursory list, denoting the key applications that I use on a daily basis and which spring readily to mind; I have not even listed the more specialized utilities like The Horde, a very good Hordling-removal tool, or the scores of other programs which make the PC the broadly-applicable platform it is.

It should be readily apparent, however, upon looking over this list of about 40 programs, that only 2--Deus Ex and Half-Life--are Windows programs. The rest run under MS-DOS, which clearly establishes MS-DOS as the dominant platform in use on the PC today. It is surprising, then, that Microsoft has not marketed or produced any updates to MS-DOS since the 1990s. This lack of initiative on Microsoft's part has driven many of their loyal customers to some of their less-compatible operating systems like Windows XP and the forthcoming Windows Vista.

All of this is the clear result of foolhardy decisions on the part of developers, businesspeople, and users. An entire generation of computer software could be improved with a little education among these groups of people. Yet people insistently continue to use store-bought, off-the-shelf software. No wonder the computer industry is in such a mess.


There was a time, not that long ago, when computer sound cards would be configured through jumpers that were actually mechanically moved on the surface of the board. For example, if you wanted to switch your sound card to, say, IRQ 2, you could do so very easily by moving a black plastic block called a "jumper" to the IRQ2 setting on the sound card. That was it; it was as simple as that to set your sound card's IRQ. The DMA channel and I/O address used by the card would be set using this same technique with other sets of jumpers on the board.

As time went on, however, it became trendy to have the card configurable through software. Rather than having to physically access the card, one could simply run a configuration utility that programmed the desired settings into the card. The card itself would usually be entirely devoid of any jumpers. While this was a cute convenience, it also raised problems, because the configuration software had to be supplied by the card's manufacturer, who would only develop it for specific operating systems; if you used a different operating system than the ones they targeted, you were pretty much out of luck unless you could get the card to work with its default settings.
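
As a concrete illustration of how software learns these settings in the DOS world, Sound Blaster-compatible cards conventionally advertised their configuration through the BLASTER environment variable (for example, SET BLASTER=A220 I5 D1 T4, where A is the I/O address in hexadecimal, I is the IRQ, and D is the DMA channel). The following C sketch does nothing more than parse that string and report what it finds; it is a minimal illustration rather than a driver, and it assumes the conventional field letters just described.

    /* blaster.c -- illustrative sketch: report a Sound Blaster-compatible
     * card's settings from the conventional BLASTER environment variable,
     * e.g. SET BLASTER=A220 I5 D1 T4 (A = I/O address in hex, I = IRQ,
     * D = DMA channel).  Any C compiler will do for the demonstration. */
    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        const char *blaster = getenv("BLASTER");
        const char *p;
        long port = -1, irq = -1, dma = -1;

        if (blaster == NULL) {
            fprintf(stderr, "No BLASTER variable set; do YOU know your sound card's IRQ?\n");
            return 1;
        }
        for (p = blaster; *p != '\0'; p++) {
            switch (*p) {
            case 'A': case 'a':                 /* base I/O address, hexadecimal */
                port = strtol(p + 1, NULL, 16);
                break;
            case 'I': case 'i':                 /* interrupt request line */
                irq = strtol(p + 1, NULL, 10);
                break;
            case 'D': case 'd':                 /* 8-bit DMA channel */
                dma = strtol(p + 1, NULL, 10);
                break;
            default:                            /* ignore T (card type), H, P, etc. */
                break;
            }
        }
        if (port < 0 || irq < 0 || dma < 0)
            printf("BLASTER variable found but incomplete: %s\n", blaster);
        else
            printf("I/O address %lXh, IRQ %ld, DMA channel %ld\n", port, irq, dma);
        return 0;
    }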

In even more recent times, however, it has apparently become standard to do away with the practice of having people configure their sound cards at all. Rather than allowing the user to choose their settings, the sound card is simply auto-configured through some software driver. Again, this is a neat idea that often works well, but what's disturbing about this trend is that most of these auto-configurable sound cards don't even offer the option to configure them manually anymore.

I cannot understand the reasoning behind taking away the users' control over their own machine. This has happened time and time again in the computer industry, and every time it happens, the users applaud their loss of freedom. They would rather have everything done for them automatically than be able to make their own decisions.

This is particularly galling in light of how simple this particular act would be. As noted earlier, setting one's sound card settings is about as simple an act as any human could imagine. It is accomplished by moving a mechanical piece of plastic to effect a change in electrical connectivity; it is an act literally requiring no higher degree of knowledge or technical sophistication than flipping a light switch. How do these people drive? How can such a person function on an everyday level, without the knowledge or coordination required to move a small object?

Auto-configured sound cards are only usable for those who use ready-made operating systems (since the card's driver usually is only made for a specific OS). This means that such cards are useless to anyone who makes their own operating system.

Furthermore, auto-configured sound cards sometimes change their settings. Since the IRQ, DMA channel, and I/O address are out of the user's hands, they may change if the despotic "plug-and-play" routines see fit to change them. Imagine the shock and trauma users experience when they discover that the configuration they had built on as a foundation has suddenly been lifted out from under them. Your sound card's IRQ is something to cherish, something to be tattooed on your arm as a display of love, not something to switch around from one day to another. Yet developers and manufacturers continue to engage in such abusive configuration practices. How can these people sleep at night?

I fear deeply for the future of our society's audio devices. We may be entering a future in which users quite simply don't know what IRQ their sound card is set to. Perhaps it is time to take public action now; just as a well-known series of radio announcements in times not too far distant asked parents if they knew where their children were, perhaps it is time to run a string of public service announcements which earnestly ask the computing public: Do you know what your sound card's IRQ is?


Even after the technology industry has made many years of progress, it remains frustrating how many people have not adopted--and indeed, in many cases, have actively shunned--technology in their homes and lives. The interiors of most houses continue to focus on activities relating to food (cooking and eating), washing and storing the human body (bathrooms and bedrooms), and the like. Yet even modern homes in developed countries lack significant integration of electronic technology.

As a quick example of the failings of the modern home, if you took a survey of households in almost every country in the world, the percentage of households containing a signal generator would be less than 20%! Less than one-fifth of families have basic electronics available, meaning that when the need arises to generate a square wave, they simply have no good way to do so.
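
In fairness to the signal-generator-less masses, a computer with any sort of audio output can stand in during a square-wave emergency. The sketch below is a minimal C illustration (the 8000 Hz sample rate, the 440 Hz pitch, and the file name square.raw are arbitrary choices of mine, and integer division makes the pitch only approximate): it computes one second of a square wave as unsigned 8-bit samples and writes them out as raw headerless PCM, which most audio editors will happily import and play.

    /* square.c -- illustrative sketch: one second of an (approximately)
     * 440 Hz square wave as unsigned 8-bit samples at 8000 Hz, written
     * to "square.raw" as raw headerless PCM. */
    #include <stdio.h>

    #define SAMPLE_RATE 8000
    #define FREQUENCY   440

    int main(void)
    {
        FILE *out = fopen("square.raw", "wb");
        int i;

        if (out == NULL) {
            perror("square.raw");
            return 1;
        }
        for (i = 0; i < SAMPLE_RATE; i++) {
            /* A square wave spends half of each period high and half low. */
            int period = SAMPLE_RATE / FREQUENCY;          /* samples per cycle */
            int high   = (i % period) < (period / 2);
            fputc(high ? 0xFF : 0x00, out);                /* full-swing square */
        }
        fclose(out);
        printf("Wrote %d samples of a roughly %d Hz square wave to square.raw\n",
               SAMPLE_RATE, FREQUENCY);
        return 0;
    }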

Ken Olsen, President, Chairman, and founder of the legendary Digital Equipment Corporation (DEC), became famous for his quote "There is no reason anyone would want a computer in their home." That quote has become illustrative of the fact that people often fail to realize how useful key components of technology are. Although this quote frequently gets circulated to remind folks that maybe people are more willing to adopt technology than you think, people still cling to their outdated, myopic ideas of how technology relates to human society: There is no reason anyone would want an oscilloscope in their home. There is no reason anyone would want a reflow oven in their home. There is no reason anyone would want an oil refinery in their home. There is no reason anyone would want a logic analyzer in their home. The reality, of course, is that all of these devices have become fundamental components of the modern human existence, and are as basic to a household as a toaster oven or a refrigerator.

Those without basic tools of living in their homes continue to claim that such electronics have little application in the household, which shows that these people simply do not know what they are talking about. The reality is that an oscilloscope is one of the most fundamental pieces of test equipment possible, displaying the voltage waveform in a circuit as a function of time. The oscilloscope has applications for virtually any circuit imaginable; it is an eminently useful and applicable device. By not exercising their ability to view waveforms, people are willfully giving up their fundamental freedom to do so.

Nor is the problem that people are "afraid" of technology in their homes, as evidenced by two very notable electronic devices which have seen significant market penetration: The telephone and the television. The reality, of course, is that both of these devices are nearly useless today, as both of their functions can be duplicated for free with an Internet-connected computer; computers can easily download audio/visual content that replicates the functionality of a television, and voice-over-Internet programs can be freely used, making it possible to merge the functionality of a television and a telephone into a cheap Internet connection. Yet people continue to use these overpriced, obsolete devices. I threw the useless telephone and television out of my home long ago and get all their functionality via the Internet, yet they continue to occupy prominent positions in most other homes.

The oscilloscope is arguably second only to the computer as the most important device in the home, as there's simply so much you can do with it. Half of the people I know have an oscilloscope at home; as a quick check, I called my contacts to see if they had one, and sure enough, one did. The other person I know claims they do not have an oscilloscope at home, but upon further questioning, it was revealed that this person not only lacked a logic analyzer, but *actually was not using a computer at the time that I called*. This person was awake and alert, yet was doing something other than operating an electronic device. (!) Can such people actually exist in our world? It truly boggles my mind to think that there might be someone so terribly out of touch with reality. I can only hope that this person was pulling my leg and that they were in fact watching their CPU's clock cycles on their oscilloscope when I called; if indeed what they said was true, then no wonder they don't have a logic analyzer, for they surely are not human, but a sad, lonely entity lacking the companionship and joy that an oscilloscope brings to the home.

Although the ideal home has an oscilloscope, a spectrum analyzer, a logic analyzer, a solder reflow oven, and a chip fab, it might be overextending to try and get everyone to welcome all these devices in their homes all at once. I therefore propose that we begin with the simplest and usually least financially expensive of these items: The venerable oscilloscope. What our society needs is a public-awareness campaign encouraging folks to improve their quality of life with slogans like "Adopt a homeless oscilloscope today!", or "Don't mope! Get an oscilloscope!" Ultimately, of course, this campaign would culminate in slogans like "Be rad! Own a chip fab!" This particular slogan, of course, does not quite rhyme, but this could be an added benefit, as it could encourage the general populace to develop an artificial-intelligence-on-a-chip system that would be smart enough to correct this slogan into one that actually does rhyme, through application of fields of knowledge like natural language processing.

I visited a person's home once, and was shocked to see empty floor space in their living room. In the midst of a circle of chairs was a vast gaping expanse of bare floor, large enough to hold an entire computer. Yet the space held nothing. What a terrible waste! You could have fit an entire storage rack containing every standard value of resistor and capacitor in that space, with room to spare! In today's world, where real estate space is at a premium, such people should have their homes taken away from them so that the space can be more effectively used to store computers and electronics. I was so offended when I saw the disgraceful state of this person's home that I am not sure how I can ever speak to them again, even though they are a dear friend. Such terrible negligence by any human being is truly a sign of a society with deep social ills.

People sometimes mistakenly see houses, apartments, and condominiums as being for humans. This is of course a delusion; living spaces are for the storage of computers and electronics. Without computers and electronics, people could quite happily live outdoors and sleep in tents and eat berries off trees all day. But electronics require shelter from rain and they often need temperature-controlled environments, and this is why mankind builds buildings. Getting to sleep inside is a perk, but I still encourage everyone to make the right decision and fill all available volume in your living space with electronic devices. It provides for that "lived in" look, and when people ask how your children are doing, you will be able to honestly say that your third transistor curve tracer is snug and warm in its storage cabinet. Ah, what a family to warm the heart.


I am really sick and tired of people misapplying the word "obsolete" to computers. This word means different things to different people, but fundamentally, it means that something has been replaced with a newer, more capable device. While this sort of development happens constantly in the computer industry, people have a habit of grossly misconstruing "obsolete" as meaning "useless". This usage is completely inappropriate and built on the massive pit of pretentiousness that still festers underneath much of the computer industry.

The simple fact is that a computer never becomes useless. Think of a home computer from the 1980s, and compare it to a home computer of the 2000s. Think about the ads that ran for comparable computers in magazines during these time frames. What were 1980s computers advertised as being capable of? Typically they were marketed as good for word processing, processing spreadsheets, and playing games. Today, a typical overinflated snob might look at such a computer and pronounce it a worthless piece of junk because it wasn't made to go on the Internet or run the latest operating systems. While it's true that a computer from, say, 1985 probably won't run Windows XP, it will still do everything that it was made to do back then. You can still do word processing and desktop publishing with it. It is a fully functional machine capable of serving many useful purposes. You could probably even make it connect to the Internet to check e-mail and browse the Web; you wouldn't use the latest version of Windows to do it, but a much more efficient TCP/IP stack, along with a basic e-mail client and Web browser, could be written for the machine. Witness the fact that the Commodore 64, a computer first released in 1982, has been adapted to act as a Web server. It won't host terabytes of data or serve them at gigabit speeds, but you can use it to do basic home Internet tasks like reading e-mail. This proves that although Internet connectivity is something that wasn't widely adopted for home computers until the mid-1990s, a well-written TCP/IP stack is not nearly as complicated or demanding on computer hardware as PC snobs would have you believe.
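
Lest the Commodore-64-as-web-server claim sound like magic, consider how little code the basic job takes even on a modern system. The sketch below is emphatically not the Commodore's software (that machine relies on a compact TCP/IP stack of its own); it is just a minimal, single-connection HTTP responder written against ordinary POSIX sockets, listening on an arbitrarily chosen port 8080, included purely to illustrate how modest "serving a web page" becomes once stripped to its essentials.

    /* tinyhttpd.c -- illustrative sketch of a minimal, single-connection
     * HTTP responder using POSIX sockets.  Build with: cc tinyhttpd.c -o tinyhttpd
     * Run it, then visit http://localhost:8080/ in a browser. */
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>
    #include <sys/socket.h>
    #include <netinet/in.h>
    #include <arpa/inet.h>

    int main(void)
    {
        const char reply[] =
            "HTTP/1.0 200 OK\r\n"
            "Content-Type: text/html\r\n"
            "\r\n"
            "<html><body><h1>Served with very little code indeed.</h1></body></html>\r\n";
        char request[1024];
        struct sockaddr_in addr;
        int listener = socket(AF_INET, SOCK_STREAM, 0);

        if (listener < 0) {
            perror("socket");
            return 1;
        }
        memset(&addr, 0, sizeof addr);
        addr.sin_family = AF_INET;
        addr.sin_addr.s_addr = htonl(INADDR_ANY);
        addr.sin_port = htons(8080);
        if (bind(listener, (struct sockaddr *)&addr, sizeof addr) < 0) {
            perror("bind");
            return 1;
        }
        listen(listener, 1);

        for (;;) {                              /* serve one request at a time, forever */
            int client = accept(listener, NULL, NULL);
            if (client < 0)
                continue;
            read(client, request, sizeof request - 1);     /* read (and ignore) the request */
            write(client, reply, sizeof reply - 1);
            close(client);
        }
    }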

Don't get me wrong: I enjoy having a PC with gigabytes of storage, 24-bit color graphics, and a fast Internet connection, and it's certainly true that such a computer can do more things than a decades-old machine. But there is a misguided sense among the general public that computers simply "get old" and become unusable after a few years. This is absolutely untrue. A computer never becomes too old to use; it will always be powerful enough to do the things that it was able to do when it was first released, and sometimes it will be powerful enough to learn new tricks as well.

The promotion of the idea that "old computers are junk" is quite disgusting to me; it is usually put forth to push some hidden agenda, except when it is parroted by unfortunate, misguided users who repeat this line because they have heard it so many times from people who seem to know what they are talking about. These hidden agendas typically take two forms: 1. A salesperson who has a vested interest in selling more new computers, or a person who works for a computer company that has similar interests. 2. A person who sees computers as a way to glamorize themselves, much as people use expensive cars. Indeed, "old computers are junk" has a strong parallel in the automotive industry: Car salespeople will push recent-model vehicles, or any vehicle that is more expensive, and couch their sales pitches in claims that the vehicles are more reliable, safer, more luxurious, etc. The reality is obvious: THEY WANT MORE MONEY. Similarly, drivers often sacrifice huge sums of personal savings to buy a ridiculously expensive car, when the reality is that a basic mid-range car will do everything that most drivers need their car to do. Take a look at a street sometime: You'll mostly see mid-range cars. How are all these people getting from one place to another if they don't have the latest, most souped-up car available? The only possible explanation is that you don't actually need the latest, most overblown tool to get mundane jobs done. Expensive cars are usually not for functionality; they exist so people can compensate for the lack of substance in their lives by acting proud of their overpriced vehicles. The same is true of overpriced computers.

The advancement of technology is good. It contributes many things to our lives. But technology for technology's sake is one of those stubborn social ills that has not yet left the minds of the general public, where it was pounded in so forcefully by greedy PC vendors. Wake up, everyone: The computer that you're looking at right now is probably good enough to do everything that you need it to do, and you will likely never need another one. A new computer might help you do things faster, but if you're going to get a new computer, be sure that you're doing it because it's something you really want to do, not just something you're doing out of pressure from salespeople or for "keeping up with the Joneses". Don't ever let anyone convince you that a tool which gets the job done "isn't good enough".


As a further commentary on the strange ideas people entertain about obsolescence, many people have actually put forth the opinion that information rapidly becomes obsolete. While this may be true for some minor pieces of information (for example, if your phone number used to be 555-1234 but later changed, you probably don't need to remember your old phone number anymore), the simple truth is that a piece of technical knowledge retains its applicability and relevance forever.

I try to read a lot, and perhaps not surprisingly, I end up reading a lot of technical and how-to books about computers and electronics. Often, when reading other people's reviews of these books on websites, people complain that information in the book is old and outdated because the book was written for technology that came out 5 years ago. I am never able to understand the logic behind what these people write; because a technology has been around for a few years, it is automatically useless, and therefore so is a book about it?

Some of my most useful books are on the core technologies of the personal computer: The protocols and standards that largely defined the computer industry as it developed through the 1980s, and, by extension, today. For example, I have a couple of books on the ISA bus, which was the bus commonly used in the original IBM PCs and PC clones of the 1980s. The ISA bus was the standard used in virtually every PC for about a decade, until the PCI bus was developed and eventually eclipsed the use of ISA; in fact, ISA stands for Industry Standard Architecture, which is perhaps a little overly boastful, but quite accurate, because ISA was and is the standard that the PC was built on, and always will be so. Most PC-architecture computers today have abandoned ISA in favor of PCI, but ISA is still the standard that defined the PC industry for a decade, and therefore, anybody who pretends to know anything about computers must be familiar with the ISA standard. Yet people deride ISA because it's "too old".

Whenever I hear somebody claim a technology to be irrelevant because it's more than 5 years old, my response is often "Well, how old are you? If you're more than 5 years old, that means you must be obsolete too, right?" I'm not sure how I would address this to a person who was actually younger than 5 years old, but I haven't had this conversation with someone that young yet.

Some people argue that this analogy is inappropriate because people don't age at the same rate that technology does, and while I'd have to agree with them, they usually seem to take the opposite stance from mine: They claim that technology ages faster than people. Actually, it's the other way around. Isaac Newton has been dead for hundreds of years, yet the laws of physics he developed are just as important today as they have ever been. Similarly, people will still be talking about the number π (pi) and the equation E=mc² a hundred years from now. How many people can assuredly claim that kind of longevity? Not many; probably not any. The conclusion is simple and obvious: Information never dies. You and I both will.

The ISA standard will be important forever. Just because there aren't as many people using it as there used to be, that is no reason to think that knowledge of it is useless. ISA will still need to be understood a hundred years, even a thousand years from now. It doesn't matter if there are only 5 people in the universe still using ISA then; it will still need to be known. Actually, it doesn't matter if *NOBODY* is using ISA anymore. It will still need to be understood.

Many people talk about "history" and how it's important to study history. They read about ancient kingdoms and societies that existed literally thousands of years ago. They consider the study of these matters important. Yet these same scholars will turn around and denounce ISA, a standard less than a century old, as "obsolete", and the study of it a waste of time! The double-standard being employed here is so glaring and myopic that it's utterly outrageous that children are being taught in schools about wars that were fought generations ago, yet not about the standards that continue to define our world today.

Don't even get me started on TOPS-10. It is very difficult to find good books about it in bookstores today, even though it was released as recently as 1964. Yet walk a few aisles over, and you can find writings by a playwright named Shakespeare, who died in 1616! So, tell me: Which knowledge is more "obsolete"? The answer is neither. Both of these fields of knowledge will be relevant forever.


The computer industry has entered an era where many people, even people who work with computers all day, have developed a terribly illogical way of thinking about them. Nowhere is this more blatant than in the field of software troubleshooting. Today, even a majority of system administrators deploy a troubleshooting methodology which is not far removed from "shotgun debugging", the act of making random changes to code in the hopes that you'll eventually stumble upon a fix.

Any person who thinks about a computer logically--and indeed, any person who has taken an absolutely entry-level programming class--knows that a computer performs a series of instructions in sequence. A computer program is nothing more than a list of instructions for the computer to follow, one instruction at a time. The way to understand any piece of software, then, is to understand the flow of processes within the software. A single instruction within a program usually means very little, because performing almost any useful function usually requires several instructions. This means that knowing what individual instructions do is of little value; rather, understanding how each instruction fits into the larger framework of a program is the true essence of understanding that program, just as a single word taken out of context from a book has little meaning, but a sequence of words put together can form meaningful concepts. In light of this, then, the way to understand a computer program is ultra-simple: Start reading the code for the program at the beginning, and read through to the end until you have understood all the instructions the program contains. You then understand that program and what it does.

This idea may seem almost inanely obvious to some, yet unbelievably, people today seem to have a tendency to devise improvised solutions to software troubleshooting which, at best, boil down to educated guesses about what the program might be doing. You gotta wonder: Would these people apply the following thought process to reading a book?

Hmm, I wonder what this book is about. How could I find out? I could give it to someone else and ask them to read it and tell me. But they might lie to me. Hmmmm. Let me open it up and take a quick look inside. Aha! I see the word "two" inside! Maybe the book is about two of something. At the very least, some part of the book seems to be about two of something. How could I get more information about this? Hmmmmmmmmm.

Anyone who has ever read a book knows that this is no way to discover what a book says. How do you find out what's in a book? You open it, and start to read it. When you have finished reading the book, you stop. A computer program works exactly the same way. When you've read a computer program, you will not be lost when an error happens, because you will know exactly what the function is of the piece of code that failed. You can do a step-by-step debug of the program to analyze exactly what happens at the point of failure. Then you will have all the information you need to create a fix.

It's ironic that although computers are among the most logic-based devices ever invented by humankind, people still think they can understand computers without using simple logical concepts. No wonder people have so much trouble fixing any computer problem these days. The negative effects of shotgun software analysis go beyond just fixing glitches, though; this kind of thing is also what leads to computer viruses. On the Internet today, people are often given entirely illogical advice to avoid getting infected with viruses and other malware. For example, people are actually taught about concepts as absurd as "trusted sources", as in the common advice that you should only run a program that was sent to you via e-mail if the file came from a "trusted source", usually someone you know personally. This, of course, is completely ineffective, since even friends and family sometimes unwittingly e-mail each other virus-infected software. Typical advice therefore often boils down to "Don't run any programs that you receive at all". This defeats the purpose of having a computer so strongly that it's actually quite outrageous: Create a machine with the ability to exchange software programs between people via electronic mail, then tell them they can't actually run those programs at all. Why not tell car owners that they should simply avoid driving since you never know when you might crash?

All of this is pure lunacy when there exists a simple and 100% effective way of preventing viruses: Know what every piece of code does before you run it on your computer. When you read the code to a program that is sent as an e-mail attachment, you will not have to wonder or guess as to whether the program is a virus or not. You will know for a fact, because you will know exactly what that program does. There will be no need to worry about viruses using such a simple rule.

People who run computer programs without knowing what those programs do are living dangerously. Ask yourself: Would you eat food without scanning every molecule of the food with an electron microscope to ensure there is no poison in it? Of course not; you eat smart, which is why you are still alive. Would you shake someone's hand before taking a urine sample from them and doing an analysis on it? A savvy member of the human community knows better than to take such foolhardy risks. So why should software be any different?

When the simple principles of cause-and-effect are applied by users to the software and hardware they use, computers will be much more reliable and there will be less risk to users everywhere.


Even computer programmers are not invulnerable to thinking things about computers which are misguided, incorrect, and even harmful. (This latter possibility was given a famous real-world illustration in Edsger Dijkstra's 1968 article "Go To Statement Considered Harmful".) The reality is that programmers are, of course, still human, and therefore prone to the same kinds of mistakes that other people make.

One of the greatest failings that seems to afflict almost every mainstream programmer today is laziness. Most of the fundamental shifts in computer programming paradigms over the past 10 or 20 years have had nothing to do with improving the capabilities of computer software, but rather, simply making things easier for the programmer. Now, I am not opposed to making tasks easier if there are no negative effects from doing so, but computer programming is fundamentally not the world's most simple task, and it is not meant to be as trivial as turning on a light switch or putting on some socks.

The main problems with simplifying the act of programming are twofold: first, it tends to lead to inefficient code--"easy to use" programming languages are often difficult to convert efficiently into the machine language that a computer needs to run on. Second, it tends to lead to programmers who have no idea what they are doing.

When computer programming was still in its infancy, programming was mainly about the computer. Today, however, the emphasis has long been on allowing programmers to program while focusing on what they want their program to do, while being able to ignore the technicalities of how the computer works. This might sound great to the incidental programmer who just wants to make one program that works and never write another computer program again, and indeed, perhaps "toy" languages are appropriate for such people, but for any person who is serious about programming, the increasingly high levels of programming languages create multiple cumulative layers of insulation between the programmer and the computer.

In the very first days of computer programming, all programming was done in machine language. High-level languages simply did not exist. This was a time when efficiency was very greatly valued in software; programmers would spend hours trying to find a way to eliminate a single CPU opcode from their programs. This was admittedly an extreme, and today computer memory and storage are plentiful enough that it might not be practical to spend great lengths of time to pare down program size by a few bytes, but neither is it practical to go to the other extreme that coders have gone to by assuming that resources are so copious that you almost have to try to run out of them.

Nor does it make sense to focus on what the program does, rather than what the computer does. Today, the focus in the programming world has long been to lean away from thinking in terms of CPU registers and memory access cycles, and instead to focus on data structures and user interface. Many seem to believe that the best way to achieve this focus is to invent higher-level languages which automatically package and organize data structures such that the programmer does not even need to think about the memory cells or the access times associated with making practical use of those data structures. While such "black boxing" of the processes of a computer system might make programming appear to be easier, it also creates limitless potential for errors and glitches created by programmers who simply do not understand what their code is really doing. When you create code at a very high level without thinking about what is really going on inside the computer's silicon, you never know when you may accidentally create a memory access violation, or a timing glitch that causes one signal to arrive before another signal that was supposed to arrive first. These kinds of hardware-based gremlins are all too common in computers, and since they are indeed eminently hardware-focused, you'll never understand them, or even know they're there, if you don't code with regard to the hardware.
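
To make the worry concrete, here is a deliberately tiny, entirely hypothetical C example of the kind of gremlin that stays invisible to a programmer who never thinks about where the bytes actually live: a string copy that silently spills past the end of one field and into its neighbor. The struct, the field names, and the exact corruption are invented for illustration; what really happens depends on the compiler's memory layout, which is precisely the point.

    /* overrun.c -- illustrative sketch: an out-of-bounds write that a purely
     * "what does my program do" mindset never notices, because the damage
     * happens at the level of where the bytes live in memory. */
    #include <stdio.h>
    #include <string.h>

    struct record {
        char name[8];       /* room for 7 characters plus the terminating NUL */
        int  balance;       /* very likely sits right after `name` in memory  */
    };

    int main(void)
    {
        struct record r;
        r.balance = 1000;

        /* "Johnathan" is 9 characters plus a NUL: strcpy writes 10 bytes into
         * an 8-byte field.  The extra bytes spill into whatever the compiler
         * placed next -- quite possibly `balance`.  Nothing warns you at this
         * level of abstraction; the program simply misbehaves. */
        strcpy(r.name, "Johnathan");

        printf("name = %s, balance = %d\n", r.name, r.balance);
        return 0;
    }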

Being able to program while thinking about your program's goal, rather than concentrating on the computer, sounds utopian indeed, but the truth is that such black-boxing has always been appropriate only for amateurs; professionals are supposed to know about the underlying technicalities of what's going on. Would you feel comfortable going for heart or brain surgery performed by a surgeon who is not familiar with the anatomy and physiology of the human body, but who has read step-by-step instructions on how to perform the operation? ("Cut artery A. Cut vein B. Suture wound C. Refer to figure 5-24.") Such cute and simple packaging of complicated procedures is just plain dangerous. If you'd feel nervous about having such a surgical procedure, then so too should you feel nervous about using computer software written with the same mindset.

Ironically, despite many modern programmers' claims that very high-level programming allows the coder to focus on what their program does rather than what the computer does, the truth is that machine language provides the most direct control over the machine, and therefore will actually be least likely to result in unpleasant surprises where the program does something other than what the programmer intended. This is not to say that all computer software absolutely must be written using machine or assembly language, but these should be used whenever possible, because they will translate into the most direct correlation between what the programmer wanted and what the computer actually does. There are cases where somewhat higher-level languages can be quite useful for taking care of simple jobs that translate readily into machine language (like math or pure data-transfer operations), but these must be used judiciously, and with a complete understanding of the consequences for doing so, just as prescription drugs can save a person's life, but taking or prescribing them without understanding their potential side effects or interactions may be fatal. Once again, it's not about trying to make things deliberately difficult just for the sake of making life harder; it's about being savvy about what you do so that you don't accidentally create a bad situation.


There continues to be an ongoing push to "simplify" the design of a computer's user interface. One of the focal points of this effort was, historically, the part of the computer that most users look at first: The front panel of the computer's case. Although this has largely ceased to be a point of simplification for the computer industry (largely because front panels can't get any simpler than they are now), it's worth taking a look back at how the front panels of computers have changed over time.

If you look at a home computer from decades ago, you might see controls that do not typically exist on computers today. A classic example is the Altair 8800, widely credited as being the first mass-produced personal microcomputer. This pioneering machine has, on its front panel, lines of LEDs which indicate the state of every bit (1 or 0) of both the computer's address and data buses. In addition to numerous lights, there are also switches which allow you to do several important things:

- Set whether the CPU is running or paused (probably via the READY pin on the CPU)
- Single-step a single instruction from memory
- Control the values on both the data and address buses
- Individually read or enter a value to/from every single byte in the computer's memory

None of these features are likely to be found on the front panel of a typical microcomputer today. Instead, control of the memory and buses must be done indirectly through software (no small task when no software has been installed on the machine!), and there typically is no way at all to control whether the CPU is running; the CPU is just running all the time, and the user is helpless to do anything about it.
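
For readers who have never laid hands on such a panel, the toy sketch below mimics in software what those switches gave you: running the processor, single-stepping one instruction at a time, and examining or depositing any byte of memory. The four-opcode "instruction set" is invented purely for this illustration and corresponds to no real CPU, least of all the Altair's 8080.

    /* frontpanel.c -- illustrative toy: a made-up four-instruction machine
     * with the controls an Altair-style front panel offered: run, halt,
     * single-step, examine memory, deposit into memory. */
    #include <stdio.h>

    #define MEMSIZE 16

    static unsigned char mem[MEMSIZE] = {
        1, 5,       /* LOAD 5  (opcode 1: load immediate into the accumulator) */
        2, 7,       /* ADD 7   (opcode 2: add immediate to the accumulator)    */
        3,          /* OUT     (opcode 3: print the accumulator)               */
        0           /* HALT    (opcode 0: stop the processor)                  */
    };
    static unsigned char acc = 0;
    static unsigned char pc = 0;
    static int halted = 0;

    static void step(void)                      /* execute exactly one instruction */
    {
        unsigned char op = mem[pc % MEMSIZE];
        pc++;
        switch (op) {
        case 1:  acc = mem[pc % MEMSIZE];  pc++; break;
        case 2:  acc += mem[pc % MEMSIZE]; pc++; break;
        case 3:  printf("OUT: %d\n", acc);       break;
        default: halted = 1;                     break;
        }
    }

    int main(void)
    {
        char cmd;
        unsigned addr, value;

        printf("commands: s = step, r = run until halt, e ADDR = examine,\n"
               "          d ADDR VAL = deposit, q = quit\n");
        for (;;) {
            printf("PC=%02d ACC=%03d%s> ", pc, acc, halted ? " [halted]" : "");
            if (scanf(" %c", &cmd) != 1 || cmd == 'q')
                return 0;
            switch (cmd) {
            case 's':                           /* single-step one instruction */
                if (!halted)
                    step();
                break;
            case 'r':                           /* run until the CPU halts */
                while (!halted)
                    step();
                break;
            case 'e':                           /* examine a memory location */
                if (scanf("%u", &addr) == 1)
                    printf("mem[%02u] = %d\n", addr % MEMSIZE, mem[addr % MEMSIZE]);
                break;
            case 'd':                           /* deposit a value into memory */
                if (scanf("%u %u", &addr, &value) == 2)
                    mem[addr % MEMSIZE] = (unsigned char)value;
                break;
            }
        }
    }

Type s a few times at the prompt and you are doing, in miniature, what an Altair owner did with the flick of a switch.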

A more recent example is the "turbo" button which became common on PC clones of the 1980s and early 1990s. This was a toggle switch which allowed the user to switch the computer's CPU between two operating speeds. Although this feature was intended to promote software compatibility (for the sake of some software which expected the CPU to run at a specific speed), the turbo button ended up being useful for many other purposes as well. Sometimes, you just want your computer to run a little slower for some reason, perhaps so you can observe some process in greater detail. Some microcomputers had a similar software-based feature which allowed you to go into the ROM BIOS and toggle the CPU speed, but having a hardware-based switch that you could readily press was most useful. Yet the turbo button is something of a relic, almost completely vanished from computer cases today. Even motherboards rarely have pins for such a button to connect to.

A typical microcomputer today has one or two buttons on the front of the case (barring eject buttons on removable-media drives, the number of which will vary depending on how many drives are installed): A power button, and, if you're lucky, a reset button; there is so much emphasis on making computers user-obsequious that in many cases, even the reset button is absent, leaving a single power button. The result is a computer case with functionality as complex as that of a light switch: On and off. That's all you get.

Why has this been allowed to become the "normal" configuration of a computer's case? It seems that computer manufacturers have formed an opinion that switches, buttons, and lights are "intimidating" to users, and that users must be "shielded" from being able to control a computer for the user's own protection. This is, of course, foolish and wrong. The omission of a reset button from the casing of a computer is not only technically poor design (because it means the only way to reset the computer is to power-cycle it, which is mechanically undesirable), it is actually insulting to the intelligence of users everywhere. It is offensive, from a human perspective, for anyone to suppose that a simple push-button to reset a computer is too complicated for users to handle.

You wouldn't feel comfortable driving a car with no brakes, no instrument panel, and no way to start or stop the car. Yet this is exactly what most PC makers are giving people today, by making computers with no start/stop switch for the CPU and no readout of the system buses. How do you stop the CPU without shutting down the computer completely? You can't. How do you know what memory location the CPU is reading, so you can ensure that confidential data is not being accessed without proper authorization? Without a facility as simple as a string of lights, you can't. These crippled machines are actually dangerous and unsound.

Yet people continue to claim that the computer hardware of today is vastly superior to the hardware of yesteryear. This is clearly not true in all respects. While it is true that most computer hardware today is much faster than hardware of 20 or even 10 years ago, speed is not the only important factor in designing any machine; indeed, in many cases, it is not even the most important factor. In computers, usability usually trumps speed, because after all, what good is a lightning-fast machine that you can't control? What makes this situation very unfortunate is that these shortcomings in hardware are not the result of any technical limitations; rather, they are the result of misguided design decisions on the part of computer engineers who choose to deliberately cripple their designs to satisfy some misinformed edict regarding "ease of use", rather than simply choosing to make data bus control easy to use.

Today, after purchasing a new computer, it is necessary to manually solder wires onto the leads coming out of the CPU (no small task if the CPU is a ball-grid array!) and affix them to a system of switches and lights so that you can effectively use the computer. This doesn't sound like a simplification to me; it sounds like more trouble and more work than having such a set of switches and lights already installed and ready-to-use on the front of the computer. Once again, far from making things simpler for the user, dumbed-down design has actually created more work and more trouble for the user.

When the original IBM PC was released, IBM made available for it the legendary IBM PC Technical Reference Manual, which included not only full electronic schematics for the hardware, but also complete source code listings of the machine's ROM BIOS. Today, PC clones routinely ship with their cases sealed shut with stickers reading "WARRANTY VOID IF REMOVED", and often are constructed in non-standard ways which are specifically designed to be difficult to disassemble, or even impossible to non-destructively disassemble without specialized tools. It is a genuinely shameful state for the microcomputer industry to be in. If you're going to get a closed-architecture, dumbed-down computer that you can't even open--which is a patently useless machine in the first place--you might as well get a Macintosh, and at least have it come in a color that will match your decor.


Speaking of the Macintosh, I must admit that there was a time when I had nothing but the utmost respect for Apple. There was a time when they led the microcomputer industry in a very special way that no other company managed (with the possible exception of Commodore). The Apple II remains one of the very finest computers this world has ever known, a marvel of craftsmanship on both the hardware and software level. But this was a much earlier era in the computer industry, and decades have passed since the introduction of the Apple II. Since then, Apple has gone on to produce inferior, watered-down computer hardware not fit for any purpose whatsoever. Yet people still continue to adore them.

As I write this, Apple recently announced "Boot Camp", a system to allow someone to run Microsoft Windows on an Apple Macintosh computer. Predictably, reactions to this announcement were wildly mixed, but a great many people said it would be a great idea to allow users to run Microsoft Windows (arguably the industry's de facto standard OS) on the "superior hardware" of the Apple Macintosh. I figure now is as good a time as any to take a look back at the Macintosh and explain why there is nothing superior about it.

By far the biggest failing of the Macintosh is simply its secretive, closed-ended design. A real computer has expansion slots which allow any plug-in card to interface directly with the computer's address and data buses; besides this, a real computer comes with printed documentation detailing every pin on this expansion bus. The Macintosh comes with no such documentation, because most Macs have no such bus. It is true that some of the high-end Macintosh models (which seem to be intended for use as servers rather than desktop machines) have real PCI slots, but few end-users seem to be buying these models, and Apple has certainly not been proactive about advertising them, preferring to relegate these machines to a dark corner of their product catalog while proudly trumpeting tiny, emasculated machines. Even the "high-end" Macintoshes which have some real expandability have no more functionality than a basic PC clone, yet they cost much more than a PC clone would. Perhaps people have fallen for some smoke-and-mirrors marketing trick to convince them that they're getting what they pay for, i.e. the Macintosh must be inherently superior because it's more expensive.

To put it very simply, the Mac is not a computer. By the very definition of the word, a computer is a device that can be programmed. Macs are designed, from the ground up, to be closed-architecture. How does being inferior by design make computer hardware superior? The Apple II came with a hugely innovative and useful feature: A full-featured debugger built right into the ROM. You can access it from any Apple II by simply typing "CALL -151" at the command prompt. Where is a similar feature in the Macintosh? For that matter, where is the Macintosh's command-line interface?

Real computers generally need a command-line interface to get anything done, but the Macintosh is designed with a mouse-centric mentality that hobbles what the user can do. Not only does a GUI hinder the user's level of interactivity with the computer, it also tends to stifle programming ideas. The Mac was designed to be a machine for the user who wants to run word processors and web browsers, not the user who wants a truly programmable information appliance. It wasn't until OS X (basically version 10 of the Macintosh operating system), which is based on Unix, that the Mac even had a proper command-line interface as an option; even OS X is essentially an ugly GUI pasted on top of Unix. If you're going to get a Unix-based machine, why not get a cheap PC clone and install some free version of BSD or Linux on it, thereby paying less than half the cost of a Mac? You can even install the X Window System on *nix if you really want a mouse-driven GUI. Remember when Microsoft Windows first came out, and everybody said that Microsoft ripped off the idea of a GUI from Apple? That was true then, but things have come full circle now: Apple is trying to imitate the PC industry. Apple has ripped off not only the idea of a command-line interface from the *nix world, but also the actual Unix operating environment itself. More recently, Apple has copycatted the PC industry by switching to Intel CPUs, such that the transformation of the Macintosh is now complete: It uses the same hardware and software as a cheap PC clone. The only difference is, it comes branded with the Apple logo so they can justify insane, monopolistic pricing practices.

The Macintosh is not even in tune with Apple's much-hyped "Think different" slogan. This advertising scheme is intended to appeal to people with too much money and time on their hands who want to imagine themselves as "rebels" because they "flout the status quo" by buying the exact same computer as all their pretentious friends. If someone buys a Macintosh, they're buying a mass-produced piece of plastic that looks exactly like every other Macintosh out there. If a computer buyer wants to be truly creative, they would buy a nondescript white-box clone computer, and paint its exterior themselves. (The large, flat exterior of a plain PC case makes an ideal painter's canvas.) By accepting the factory-default Macintosh, these self-important posers are accepting someone else's design. Isn't it exactly the opposite of "Thinking different" to calmly accept what's made for you by a major corporation like Apple, rather than creating your own identity by thinking creatively?

Like so many other people, I have experienced the enchantment of turning on a Macintosh and entering that cute and cuddly world in which everything is made to look like it works seamlessly. Unfortunately, if you see that environment for what it really is, the enchantment is all too brief; beneath that veneer of translucent windows and simplified error messages is a computer that runs on chips and circuits like any other. At some point, the Macintosh's undying efforts to shield you from its workings will start to get in your way rather than making your life easier. The Mac has always had its place as the computer for people who truly and genuinely do not want to know anything about computers; for the crowd who vehemently wants to remain as non-technical as is humanly possible, the Macintosh is a computer for you. Anyone who wants to be any kind of a power-user, however, has always been able to recognize the Mac for the shoddy hoax that it is. The Mac is a toy computer for casual users, and it fills this sole niche well.


I'll be honest: I don't know very much about computers. I truly don't know the first thing about computers. You know how they say that the beginning of wisdom is to understand that you don't know anything? They're largely right. A computer is complex enough that a person with any real understanding of them knows that most modern computers are beyond the full comprehension of any human being.

If you say that you don't know the first thing about computers to some people, though, their initial response is "Well, you know how to turn a computer on, right?" This is a bit like saying that the first step to becoming a millionaire is to spend a million dollars. In actuality, spending the money is what you do AFTER you get it; first you need to make that money. Similarly, turning a computer on isn't the first thing about computers; it's the last thing about computers. The first thing about computers is purifying and doping silicon.


One of the reasons for the regrettable lack of new ideas in the computer industry is a quantum shift in what the industry and its users think that computers are about. This change was mainly brought on by the Internet.

Mention computers to a typical person today, and they will usually immediately think of the Internet. To many non-computer people, the Internet and computers are virtually synonymous; there seems to be little perception of the "personal" computer left. Now the computer is nothing more than a terminal with which to connect to the network.

While the Internet performs many important functions and is of notable significance to the computer industry, it has also transformed the idea of what computers are for in the public's mind. Today, the computer is seen as a communications device, like a telephone. People use the computer to send e-mail and chat online. This is one reason why the computer has taken on diminished significance in the household: You can use a telephone (whether wired or cellular) to talk to people more easily than you can use a computer for this purpose. Since these two devices perform the same task, why use the larger, less simple computer when the ultra-familiar telephone is near at hand? This viewpoint grossly misunderstands what a computer is capable of.

Since so many people seem to have forgotten, let's take a short trip back in time and remember the many ways 1980s computer ads patiently explained to us how computers could be used at home:

A computer is a device for learning. A single floppy disk can store the text of several books. A single CD-ROM can store the text of an entire bookcase. Today's hard disks can store a whole library's worth of text. Not only can a computer store a great deal of information, it can also present it in ways that a book cannot, augmenting information with animation and interactive activities that help reinforce learning in ways that are sometimes more effective than raw information. This is not to dismiss books, of course, but the simple truth is that information in a book is static, while information in a computer can be made "live", adjusting itself as circumstances require.

A computer is a device for experimentation. Within the memory of a computer, it is possible to create virtual environments in which you can test ideas and hypotheses. The classic "Game Of Life" is a very simple example of an artificial-life experiment that can be easily and quickly performed on a computer, but countless other possibilities for testing similar ideas exist. You can perform vehicle-collision simulations on a computer. You can perform chemical-reaction experiments on computers. You can perform human-interaction experiments on a computer. All of these and more are made possible by the simple capability of a programmable information device.
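
For anyone who has not met it, the Game Of Life mentioned above really is that simple. The sketch below is a bare-bones C rendition that seeds a small grid with a glider and prints a few generations to the terminal; the grid size and generation count are arbitrary, while the rules themselves (a live cell survives with two or three neighbours, a dead cell with exactly three is born) are the standard ones.

    /* life.c -- illustrative sketch: Conway's Game Of Life on a small grid,
     * seeded with a glider and printed to the terminal for a few generations. */
    #include <stdio.h>
    #include <string.h>

    #define ROWS 10
    #define COLS 20
    #define GENERATIONS 8

    int main(void)
    {
        int grid[ROWS][COLS] = {{0}}, next[ROWS][COLS];
        int g, r, c, dr, dc;

        /* Seed a glider near the top-left corner. */
        grid[1][2] = 1;
        grid[2][3] = 1;
        grid[3][1] = grid[3][2] = grid[3][3] = 1;

        for (g = 0; g < GENERATIONS; g++) {
            printf("Generation %d:\n", g);
            for (r = 0; r < ROWS; r++) {
                for (c = 0; c < COLS; c++)
                    putchar(grid[r][c] ? '#' : '.');
                putchar('\n');
            }
            putchar('\n');

            /* Apply the rules: live cells survive with 2 or 3 neighbours,
             * dead cells with exactly 3 neighbours come to life. */
            for (r = 0; r < ROWS; r++) {
                for (c = 0; c < COLS; c++) {
                    int neighbours = 0;
                    for (dr = -1; dr <= 1; dr++)
                        for (dc = -1; dc <= 1; dc++)
                            if ((dr || dc) &&
                                r + dr >= 0 && r + dr < ROWS &&
                                c + dc >= 0 && c + dc < COLS)
                                neighbours += grid[r + dr][c + dc];
                    next[r][c] = grid[r][c] ? (neighbours == 2 || neighbours == 3)
                                            : (neighbours == 3);
                }
            }
            memcpy(grid, next, sizeof grid);
        }
        return 0;
    }

Watch the glider crawl toward the far corner; that, in miniature, is the kind of experiment the old ad copy had in mind.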

A computer is a device to create with. Many programs exist which allow the user to create graphics, music, and even films (animated or real-world). You can create quality art without having to spend thousands of dollars on expensive studio supplies and equipment. Non-computer people sometimes see computers as boring and "logical", but computers can also be artistic.

A computer is a device for exploration and entertainment. Computer games aren't just for fun; they can educate as well. The world of a computer game can be as large as the human imagination, and within that virtual world, you can put as much content as you want. You could create a world that would theoretically take a lifetime to explore. Or you can just have fun with it.

A computer is a practical information appliance. You can use it to store data, which can then be later formatted and organized in useful ways. You can use a computer to remind you of important dates. You can use a computer to do math calculations. All of these everyday functions can be integrated into even the most simple of computers.

Perhaps most importantly, however: A computer is a device you can program. This means that if the computer cannot currently do something you want it to do, you can program it to do what you want. (Or, if you do not wish to do this, you can have someone else program the computer for you.) This level of versatility is unmatched by any other device in the common household today.

All of these are things that could already be done with computers 20 years ago. At the time, people predicted that computers would be used for even more creative and exciting purposes 20 years down the line; what they didn't foresee is the inherent laziness of people. The truth is that most people are too lazy and unmotivated to learn, experiment, create, or program, and most gamers prefer the "twitch" action games typical of consoles over the more cerebral fare available on a computer. And so the computer is not used for any of the purposes it was originally touted for when the phrase "home computer" first entered the English language in the 1970s. Instead, people use the computer to do the one thing that people still love to keep doing: Talk.

People love to talk. Talk, talk, talk. Many people would do nothing but talk all day if they could. This is why the popularity of cell phones has exploded far beyond what is practical: The cell phone is not usually used as a practical device, either. It's a device for entertainment, a gadget to talk on for bored people with nothing better to do.

With the Internet, the computer took on its current main role: A device to communicate on. Since then, that role has almost eclipsed all others.

This is not a technology problem. The technology to take computers far beyond their current applications has existed for decades. This is very definitely a human problem. The root of the problem is easily observable by simply watching everyday people in everyday environments; regardless of what they do for a living, their income level, or where they live, people mainly do two things: 1. What they have to do, and 2. Talk. The typical adult has a job of some kind which they do because they have to do it, and often people have to do other tasks like washing dishes or laundry, but when people are not required to do anything, they will usually resort to talking because they have nothing else to do. In a store, in a library, at home, eating, riding public transportation, or simply enjoying leisure time, it seems to be the human tendency to want to do nothing but talk. This obviously consumes valuable time which could be spent doing better things, but this is an essay on computers, not social commentary, so rather than going on about why this inclination to talk endlessly is foolish, let's just say this:

The problems that exist with technology today are not technical in nature. The problems that exist with technology today are social in nature. They stem from problems that are deep-seated in the core of human society. Only when people realize that a computer is much more than just a device to chat on will we see the return of the creativity and innovation that once imbued the computer industry. Ironically, the once-touted ability of the networked computer to enable communications was the worst thing that ever happened to the computer.

The computer is not just a device for communication. The computer is a device to handle life's information needs. When used correctly, the computer augments the human mind, and enables people to think more quickly and accurately than ever before. Although the human mind will always have the edge on "right-brain" artistic and feeling-based thinking, the computer can do "left-brain" mathematical, scientific, and logical thinking better than any human. When you make the computer work for you the way it's supposed to, you'll find yourself doing amazing things with your computer you would never have imagined.


Many software programs are now being sold online. This seems like an obvious distribution model, as software is generally regarded as not being a physical product; therefore, you can cut down on materials and shipping costs by not having to buy media and boxes to package the software into. All that remains is the bits. Furthermore, since those bits can be downloaded, there's no waiting for ship times. You can get the software as fast as your network connection allows.

While this is an effective distribution method for some types of software, many people have taken the rather extreme view that all software will be (or should be) distributed this way. This viewpoint misses the basic fact that software is not just bits. Software is the feel of a manual against your fingertips. Software is a properly-labeled disk to store the program on and to reinstall from if you need to. Software is pleasantly-designed box art to hold the bits that lie within. All of this context is lost through network-only distribution.

It's much the same as with books: Many people believe that books are destined to be read on computers rather than on paper, but you simply can't forgo the simple act of holding a book in your hands and turning the pages as you read. Some books may be appropriate for electronic conversion, but you can never completely replace paper. Some software could be effectively distributed online, but software must always be made available in boxes, sold on disks, with paper manuals (PDF manuals are not enough) and warranty registration cards which, when sent in, earn you a lifetime subscription to the software publisher's paper-based catalog, mailed to you at least 4 times a year.


Some people, upon reading this page, have speculated as to whether the viewpoints presented here are real or satirical. The implication seems to be that no person in the world could actually be as extreme or correct (or extremely correct) as I am, and therefore the person who wrote this page must have been simply exaggerating some points for humorous effect. Of course, people who believe this are wrong and should be punished.
