The Computer Industry After A Decade Of Loss
by Adam Luoranen
January 2009

When computer industry pundits try to predict the future, it's understood that there is always a certain amount of uncertainty involved. No one knows the future, and predicting the future of a relatively fast-moving field like technology is even more questionable. In general, even among experts, visions of what will happen in the future are often shaped by personal opinion, bias, or whim as much as by real information. A less often-recognized truism, however, is that differences of viewpoint exist even regarding the circumstances of the present. No crystal ball, psychic ability, or even particular insight should be required to see what is happening in the immediate present, yet the events of today are still interpreted in such a wide variety of ways that what one analyst sees as a world-altering event can easily be dismissed as business as usual by another. They say that hindsight is 20/20, but it really isn't; different people have vastly different views of the past, present, and future. If you doubt this at all, just spend some time talking to a Holocaust denier, or ask a few different economists why the Great Depression *really* happened.

It is perhaps instructive, then, that today, in the early days of 2009, the event which most so-called computer experts are still watching for in the perhaps-not-too-distant future is one which has already occurred, unacknowledged by most publicly prominent media figures. The disappearance of the computer industry is not a future prospect. It is not something which requires speculation. It exists in the present. It has already happened. The question is neither if it will happen nor when it will happen, but rather, "What do we do now that it has happened?"

A key reason why this development has gone unnoticed by so many for so long is the tendency of image-conscious industry icons to shapeshift in endless, desperate attempts to appear relevant. In big business, marketing and PR sometimes matter more than actual money, and if you can fool most of the people all of the time, or all of the people most of the time, then the hope is that the payoff lies a little farther down the road. Not too much farther, because big businesses (especially technology businesses) rarely plan seriously for more than a few years ahead, but far enough that a short-term loss for the sake of effective image management is an acceptable loss.

Today, if you look at the names which dominate the computer industry, things haven't changed too much from 10 years ago. You still hear a lot about Microsoft, Intel, and Apple. Most PCs still run Windows, Linux is STILL the upstart new operating system that's set to become the next big thing any day now, and almost all people who use computers--both those who are computer savvy and those who are not--worship networks in general, and the Internet in particular, as the only real reason to have a computer, as if a non-networked computer were about as useful as a wrench with no handle. What nobody--certainly not the big businesses that remain in the industry, and apparently not the computer users either--wants to acknowledge is that the computer industry has not had a single major innovation in the past 10 years.
Sure, computers have gotten a little faster, broadband Internet access has become somewhat more widely available, and hard drives have certainly become larger, but none of these have significantly altered the framework of how people use computers, for either business or personal purposes. Most people still use computers to trade e-mail, surf the web, use basic office applications, and play games. The so-called innovations that have marked the past decade, like the rise of blogging, so-called "social networking," and "Web 2.0," are neither new ideas nor innovations, but simply embarrassments: old ideas in new packages, marketed as the next new thing to computer-illiterate "experts" who gobble up these developments as the future of the computer industry, despite the fact that they have nothing whatsoever to do with computers.

On the hardware side of things, the only major development has been that people are shifting to mobile phones and away from the traditional desktop computer. In other words, people don't use computers anymore; a mobile phone, no matter how feature-rich and networked, cannot even begin to approach an actual computer in terms of expandability, flexibility, or solderability. People know this and admit it openly, yet they don't really care, suggesting that perhaps Ken Olsen was right after all when he made his famous statement, in 1977, that "There is no reason anyone would want a computer in their home." People don't really want computers; they just want cute, shiny toys. Ask anyone who bought the relatively new BlackBerry Storm what drew them to the phone, and they'll happily talk about how cute the phone is, how nice the graphics look, and how easily it slips into their pocket. People aren't quite sure what the phone does, but it doesn't really matter as long as it looks cute, so they can continue to vomit up more verbal diarrhea about how nice the phone looks. If people really want something small and cute that fits into their pocket, might I suggest a photograph of a LOLcat? It would probably be cheaper and more amusing.

Back when computer engineers had a place in society, the phrase "liberal arts" tended to have negative connotations among engineers. It immediately brought to mind visions of people who served no practical purpose, who studied Roman history and French art because they had nothing better to do. Engineering, by contrast, has traditionally been seen as an eminently practical pursuit: the path of those who build the systems that make the world go round. Yet if computers have no practical place in society anymore, where does the computer engineer fit in? Today, I feel like my study of computer architecture, CPU instruction sets, and semiconductor materials is more like the study of history than anything else. It is the study of information that might once have been useful and relevant, but now is little more than incidental trivia. Despite all my training and studying in computer engineering, I feel like I might have ended up being a liberal arts major after all.

Perhaps part of the problem is that people continue to imagine a link between education and industry. They invent stories implying that the things a person learns in college or university are somehow connected to the things a person does in commercial jobs, when the reality has long been otherwise. Ask any career engineer how frequently they use calculus or differential equations on the job; for most of them, the honest answer is almost never.
For centuries, universities have taught students the essence of things, with the attitude that such understanding was not taught to increase their employability, but as an end in itself. Conversely, businesses don't concern themselves with essences, but rather with profit. People sometimes say that if you're smart and educated, you can get a good job, but this really isn't so; a business will only employ someone if they believe that person can be profitable to them. It has nothing to do with intelligence or education.

When you analyze the fundamental nature of the computer versus the nature of business, you realize that they cannot work together; they are simply not compatible. The computer is a tool to liberate and augment the mind, while business is a tool to force or fool people into giving you money. Business has a way of warping everything it touches. What was simple becomes complicated when viewed through the haze of marketing plans. Simple, clear language becomes obfuscated and misleading, twisted in such a way as to provide maximum profit for a company.

The fundamental disconnect between business and the computer world can clearly be observed in movements which, thankfully, have been slow in their onset, but are nevertheless inevitable. (I use the word "movements" rather than "trends," because these are, sadly enough, unlikely to be trends; they're here to stay.)

One of these movements is the move toward user-based operating systems. Windows NT started this, and the subsequent versions of Windows built on it carried on the "new normal" of operating systems which not only require users to log on, treating them as just another user account on the system, but also use file systems that associate a list of user permissions with EVERY file on the system, turning the whole computer into one giant police state. (A toy sketch of this per-file permission model appears below.) This makes sense on a mainframe with several users, but what nobody seems to understand is that a home PC doesn't need permissions, and shouldn't have them. Why would you want to lock yourself out of your own computer? Why does your computer need to log you in each time you want to use it, as if you were a suspicious entity whose movements needed to be tracked? It was not so when home computers were real. Multiuser operating systems are not appropriate for a personal computer.

Similarly, consider the thin-client movement. This is a war that's been ongoing for many years, and it's still far from over, but basically, many people have been touting the effectiveness and efficiency of providing all users with stripped-down terminals that have little processing power of their own and depend on a central server for all their data and functionality. Again, there may be some significant advantages to this from a business perspective, because it means employers can make all their employees work from the same data set and take away all users' computing power, forcing them to do everything on some server that is controlled by management. However, this same idea is utter anathema to the idea of personal computing, in which a user owns their computer and has complete control over all processes and data within that computer. What's good for a personal computer user is often the utter opposite of what's good for business. Computers cannot, and *should* not, be mixed with business; if they are infected with business, they lose their effectiveness at computing.
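For readers who have never looked at one, here is a minimal sketch of the per-file permission model mentioned above, written in Python with invented names (FileEntry, can_access). It is a toy illustration of the idea only, not how NTFS or any real file system is actually implemented: every file carries its own list of user rights, and every access is checked against that list.

```python
# Toy model of per-file permissions, for illustration only.
# The names (FileEntry, can_access) are invented for this sketch; real
# multiuser file systems such as NTFS use far more elaborate ACLs.
from dataclasses import dataclass, field

@dataclass
class FileEntry:
    name: str
    owner: str
    # The per-file ACL: a mapping from user name to the rights that user holds.
    acl: dict = field(default_factory=dict)

def can_access(entry: FileEntry, user: str, right: str) -> bool:
    """Every single file operation goes through a lookup like this one."""
    return right in entry.acl.get(user, set())

# Even on a single-user home PC, the owner is treated as just another
# account whose rights must be looked up on every operation.
homework = FileEntry(
    name="homework.txt",
    owner="alice",
    acl={"alice": {"read", "write"}, "SYSTEM": {"read", "write", "delete"}},
)

print(can_access(homework, "alice", "write"))   # True
print(can_access(homework, "alice", "delete"))  # False: locked out of her own file
```

In toy form, that is the "police state" I am objecting to: even the machine's sole owner is just one more row in a table consulted on every access, a design that serves the administrators of shared systems rather than the person who owns the computer.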
A more perfect demonstration of this concept could not be imagined than the slow but steady decline of Apple. In 2008 and 2009, for the first time in history, Apple computers seem poised to actually become more widely used than IBM PC-compatibles. This only happened when Apple made a firm commitment to making computers as terrible as they could possibly be. It is tragic, but also instructive, that the company that was largely responsible for creating the personal computer industry with two of the best computers ever made--the Apple I and the Apple II--is also the company that ended it all. The place where it all began is the same place where it all ended: with computers designed for people who want to do inane things like edit or watch videos. At least half of the people who own MacBooks or iPhones could not name the mnemonic for a single opcode recognized by the CPU in their Apple devices, let alone that mnemonic's hexadecimal or binary code (a short illustration of what this means appears a few paragraphs below).

Perhaps nothing can illustrate the end of the computer industry more effectively than the rise, in 2008, of the popularity of "netbooks," small laptop computers that weigh only one or two pounds and are almost unimaginably inexpensive. Small handheld organizers have existed for decades, but netbooks are different because they aren't just pocket organizers. They run real operating systems and can install and run normal PC applications. Netbooks are distinguished only by being smaller and cheaper than other laptops. In all other respects, they are true computers, programmable and expandable. They can perform all common tasks that people typically do with computers, except run complex, hardware-intensive games. Yet they are so incredibly cheap that almost anyone can afford them.

Having reached this point of universally functional and affordable computing, I can only come to one conclusion for my computer-industry colleagues: we both won and lost the war. We really did produce the dream that people had been envisioning. We really made machines that can store as much real data as most people need, connect wirelessly to a global information network, and do it all in a way that's energy-efficient and affordable. Where, then, do we go from here? Why would we need to do anything further? Sure, you can always play the numbers game and make chips with more transistors, hard drives with more bytes, and productivity applications with more features that no one uses, but how long can you keep that up before people lose interest and realize that there's no point to it anymore? If Steve Jobs had his way, he would convince people that the answer is "indefinitely," but he is a businessperson, which is to say, he is a person who is paid to tell lies that will convince other people to give him money. We won the war, but now the war is over, and there are no positions left for us.

Ultimately, almost any form of paid employment can be placed into one of two categories: progress or maintenance. People with progress jobs create progress by achieving new things. They design hardware or software, or they write things, or even just come up with new ideas. People with maintenance jobs don't create new things; they simply look after things and make sure that everything is running smoothly. People who have progress jobs are, by the very nature of their employment, in the business of making themselves obsolete. Once their progress is complete, they either need to create progress somewhere else, or they are no longer needed.
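As a brief aside, here is what the earlier remark about mnemonics and opcodes refers to. The CPUs in today's MacBooks and iPhones are not the chips I learned on, so for familiarity's sake this sketch uses the MOS 6502, the processor in the Apple I and Apple II mentioned above; the abbreviated table and the helper function are my own illustration, not a complete instruction set.

```python
# A few well-known opcodes of the MOS 6502, the CPU of the Apple I and Apple II.
# Abbreviated for illustration; the real 6502 has one encoding per
# mnemonic/addressing-mode combination.
OPCODES_6502 = {
    ("LDA", "immediate"): 0xA9,  # load the accumulator with a constant
    ("STA", "absolute"):  0x8D,  # store the accumulator to a 16-bit address
    ("JMP", "absolute"):  0x4C,  # jump to a 16-bit address
    ("RTS", "implied"):   0x60,  # return from subroutine
    ("NOP", "implied"):   0xEA,  # no operation
}

def describe(mnemonic: str, mode: str) -> str:
    """Show a mnemonic alongside its hexadecimal and binary opcode."""
    op = OPCODES_6502[(mnemonic, mode)]
    return f"{mnemonic} ({mode}): hex {op:02X}, binary {op:08b}"

print(describe("LDA", "immediate"))  # LDA (immediate): hex A9, binary 10101001
```

That correspondence between a human-readable mnemonic and the byte the CPU actually executes is the sort of knowledge I have in mind when I talk about understanding a computer, as opposed to merely owning one.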
Our progress is done; we've created the computers people always wanted, so why would the industry still need computer designers? That leaves maintenance people, and when computers are so cheap that they cost more to fix than to replace, maintenance people also become obsolete in a profit-driven environment.

Even people in the industry now admit it. There can no longer be any realistic denial of it. Ask any person who works in the so-called "computer" industry: in a present-day business environment, the people in the greatest demand are not those with strong technical skills, but rather those with a strong understanding of business principles and some idea of how computers can be shoehorned into that framework. Incredible though it may seem, the contemporary notion of "computer people" has reduced them to the ranks of accountants, marketers, salespeople, managers, and the like: "businesspeople," the class of people who, in the absence of an artificially created business environment, would rightly be regarded as having no useful skills. The "computer" industry is not populated by those who studied calculus, electrodynamics, and signal integrity, but rather by people who think about money a lot, and whose idea of "computer literacy" consists primarily of knowing how to push a mouse around on a desk and use a search engine. (Please note that the actual minimum definition of "computer literacy" is being able to draw a schematic for a 16-bit CPU using MOSFET symbols with no prebuilt ICs.)

It's 2009, the last year of the first decade of the new millennium, and the computer industry has suffered a decade of steady losses. The official statistics on employment in the computer industry have fluctuated somewhat, but the actual number of computer workers has gone steadily down. The statistics are inaccurate because they tend to count people like Java programmers as being employed in the computer industry, even though a programming language is not a computer and has nothing to do with computers. Edsger Dijkstra famously said that "Computer science is no more about computers than astronomy is about telescopes." Everybody knows it deep down inside; they just don't want to admit it.

You may wonder why I am writing this in early 2009 instead of waiting for the end of the year. Technically, it hasn't really been a decade yet; it's only been 9 years. Something could still happen. Maybe between now and December, there could still be a complete turnaround, and there might still be an economically useful reason to know something about computers. But it doesn't really matter. The events that have been set in motion cannot be undone, and any short-term reversal would only be temporary.

The greatest characteristics of the computer industry of the 1970s and 1980s--creativity, excitement, passion, understanding, sharing, joy, and fairness--are not found in the world of business. They have no place in a world motivated by profit and competition. I'm writing this now, while I still have some measure of familiarity with the pulse of the computer industry. This may well be the last article of this nature I write. Hopefully I will move on to better things. To my colleagues in the field: best wishes to all of you.