Myths of the computer world

There Is A Shortage Of Technology Workers, And Going To A College Will Help You Fill That Gap

The first part of this myth is half-true; it really comes down to which sector of technology you are talking about. It is popularly believed that every branch of computer technology has a huge shortage of workers, and that anybody with any kind of skill in the computer industry is ready to make a good living. This thinking is fueled by constant reports that there are tens of thousands of technology positions going unfilled for lack of people with the skills to fill them.

But a look at the job postings, and at the people coming out of college, will tell you the real story. The truth is that the popular technology sectors have not only been filled, but flooded with more workers than are needed. Many people believe, to this day, that not enough people know how to install a video card or create a web page with basic HTML. This is not true; essential web design skills and PC servicing knowledge are popular subjects for people new to computers, and by now we have more web designers and hardware techs than we need. If you are A+ certified or you know HTML, you'd best believe that this alone will not net you a good job. Maybe 5 years ago it would have, but not now. The supply of these kinds of basic technology workers far exceeds the demand. The result is a lot of unemployment among those people, and much lower wages for those who do have jobs in those fields.

Even the major development languages are pretty much full by now. Java, for example, is a hot technology, but so many people have learned it that there is no longer a great need for Java programmers. It is not a completely flooded market yet, but with the way the industry is going, it will be soon; so many students are studying Java that it will soon be a common skill.

Colleges love to quote these figures, declaring that there is still a huge need for computer technicians and that a course in networking or web design will guarantee students a comfortable living. This is not true; the skills taught in most technical colleges are precisely the ones that are no longer needed.

Instead, the need for technology workers has shifted to the more exotic fields. One important field right now is CRM (Customer Relationship Management) and the database-like products associated with it; within this field are a few major products such as PeopleSoft and SAP. If you are knowledgeable in one of these products, you stand a much better chance of getting a job, and if you do get hired, you are likely to make more money than a PC repair tech. I have yet to see any colleges teaching these products; instead, the career colleges continue to offer the same stale courses in HTML, Java, and Windows administration... the same subjects that do not offer many employment prospects anymore.

Believe it or not, there are still many professional opportunities working with mainframes. Despite the many people who believe that the mainframe era is long gone (and the many ultra-hip colleges that try to make their students believe the same), plenty of businesses still run on proprietary mainframe technologies, most notably the AS/400. Yet I have not seen any colleges that teach mainframe technologies. The reason is simple: it's not glamorous. Young people have been so tricked into believing that new, hot technologies are where the money's at that they all want to learn the same things everybody else is learning. There seems to be a general lack of understanding that business is still business, even in the Internet Age, and businesses care more about what makes money than about what's new and cool. If a career college advertised courses in mainframe education, it would be laughed at, and that would damage its reputation. That's why they don't do it; the purpose of a college is to make money. Make no mistake, the college itself is a business, and it is more in the business of taking students' money than of educating them.

If you want to see where the real money is, just browse the tech job listings on any of the major Internet job portals and see which technologies companies are actually asking for. When you do, you will start to realize that the real money lies in technologies you have probably never even heard of, and which are not taught by any college. The best education is still to teach yourself; when you learn these skills from books on your own, you will be a more marketable employee than most of the people coming out of the career colleges.

The Internet Was A Defense Project, Built For Use In War

This may or may not be a myth, depending on how you look at it. It is certainly true that the foundations of the Internet were laid by ARPANET, a network funded by the United States Department of Defense (specifically by ARPA, its science and technology research agency). Without ARPANET, the Internet as it exists today would probably not have happened. So, yes, much of the funding needed in the formative days of the Net came from the DoD, presumably because people there thought that an information network would have applications in wartime.

But to the technicians who actually built the network, war was probably the farthest thing from their minds; to them, the Internet was a community, a humanitarian project designed to let researchers and academics exchange information and ideas on a much broader scale than had been possible before the computer age. Hackers like to point out that none of the people who played a truly formative role in the making of the Internet (from the technical point of view) were government types looking to wage war. That may be true, and at least to the techies, the Internet was not a tool of the military.

But for ARPA to put that much funding into it, they must have seen some kind of military application in the Internet. Although the histories of the people who made the Internet usually focus on the techies, it would be naive to think that there was nobody at the top of the chain who, behind the scenes, had other plans for the national (and ultimately global) network that was being built. So this is only a half-myth, really.

A Cracker Can Penetrate Anything; or, This System Is Uncrackable

In the world of computer folklore, there exists the legend of the elite cracker, the guy who can get through any system's defenses with sheer skill. It is popularly believed by the non-technical public that there exist crackers in the world who can break into virtually every computer on Earth with ease.

That may have been the case in the 1970s, but today, cracking is virtually dead. Decades ago, security was barely given a second thought in the computer world: it was standard for computer companies to sell and ship systems with a series of well-known security holes, and for the recipients of those computers to run them with the holes still in place, usually without knowing they were there. Those were the days when most systems could be cracked with a few basic cookbook techniques.

Today, however, the world of cracking is much more difficult. The average cracker may go through their entire career without a single major crack, and with perhaps only a few minor ones.

The other side of the equation is the notion of an "un-crackable" system. The same element of popular mythology applies here: sure, computer security is much better than it was just a few years ago, but if somebody really wants to get into your system, they will find a way. It may take a long time (several of the greatest hacks are said to have taken weeks, months, or perhaps even years of studying the target's weaknesses), but in general, the dedicated cracker will find a way. This need for persistence, however, is exactly why cracking has died down considerably: most crackers crack for fun or excitement, and it is not "interesting" to spend that long casing a target and learning all of its vulnerabilities.

BASIC Stands For Beginner's All-Purpose Symbolic Instruction Code

BASIC never stood for anything when it was first invented. It wasn't until later that people felt it would be cool to have it actually stand for something, and invented this contrived expansion for it. (This kind of acronym, which originally doesn't stand for anything but is later said to stand for something, is called a "backronym".)

BASIC Never Stood For Anything

Actually, it seems that BASIC really did stand for Beginner's All-Purpose Symbolic Instruction Code right from the beginning. The Jargon File claimed that it was not originally an acronym (which is why the above "myth" was listed here), and the Jargon File is such a well-known and widely-used resource that many people believed the same. More recent versions of the file, however, state that BASIC really did always stand for this, and that the earlier versions saying otherwise were incorrect. Now nobody's quite sure what to believe.

Java And JavaScript Are Similar

Java and JavaScript have almost nothing to do with each other. Java came first, from Sun Microsystems; JavaScript came second, from Netscape, and shared little with Java beyond some surface syntax and the name, a clear attempt to get people to *think* the two were related. Clearly, judging by how many people confuse them, Netscape succeeded.
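
To see how little the two actually have in common, compare how each one might print a simple greeting. This is only an illustrative sketch (the Greeter class name is made up for the example): Java is a compiled, strongly typed, class-based language that runs on its own virtual machine, while JavaScript started out as a loosely typed script interpreted by the browser as a page loads.

    // Java: code lives inside a class, types are declared explicitly,
    // and the source must be compiled before it can run.
    public class Greeter {
        public static void main(String[] args) {
            String name = "world";
            System.out.println("Hello, " + name);
        }
    }

    // JavaScript (Netscape-era): a loosely typed script embedded in a
    // web page and interpreted by the browser as the page loads.
    var name = "world";
    document.write("Hello, " + name);

Beyond the C-style braces and semicolons, the two differ in their type systems, their object models, and where and how they run.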

The Market Will Ensure That The Best Products Are The Most Popular; or, If A Product Sells Well, That's Because It's The Best

This is untrue even in other industries; many people, for example, buy inferior cars simply because they prefer a particular make or model. Unfortunately, the complexity of the computer industry makes this myth even less true, because of what I consider the most important requirement for any computer product, be it hardware or software: compatibility.

Although different models of cars use very different parts that are generally not interchangeable, cars can still interoperate peacefully, because they all operate on the same principle: they roll around on wheels, and they can share the road just fine as long as they don't hit anything. In the computer industry, it gets much more complicated than that: computer products need to interoperate, and that is incredibly hard to do in a world where each manufacturer wants to create its own standard, usually incompatible with everyone else's. This leads to the need for universal standards, because without them, each new product would require the purchase of a whole new computer to use it. Many people remember the early days of micros, when there were the Commodore 64, the Apple II, the Amiga, the IBM PC, and other less well-remembered platforms to design for. The poor software companies often ended up releasing versions of their software for *all* of these platforms, since each held a significant market share. Although this gave consumers the power of choice, it also made it incredibly difficult to design a product that could work with a majority of computers.

Of course, eventually the IBM PC won the battle, and although the PC benefited from an open system design that the other platforms didn't have, many would allege that the PC was not the "best" of the micros on the market at that time.

Today, the computer world is struggling through a huge battle between operating systems, the likes of which has scarcely been seen before. Indeed, the battle is so significant that it spills over from the computer world into the mainstream (although this is largely because computers matter more to the mainstream than ever before). Those familiar with the history of Windows are well aware that everybody uses it simply because Microsoft gained a significant market share with MS-DOS in the early days and has built on it ever since. It's clear that Microsoft long ago stopped having to produce the "best" operating systems to get the biggest piece of the pie.

Even Linux, "the little operating system that could", heralded by many as their personal saviour from the Microsoft monster, is already crumbling under the industry's ruthless proprietization, even before it has had a chance to become big: Linux is now mostly used in commercial distributions such as Red Hat, PowerLinux, or SuSE. Can you see where this is going? That's right: these different distros of Linux are slowly becoming incompatible with each other, which means that whichever version has the biggest user base (clearly Red Hat at this point) will be the one most Linux developers write for, regardless of whether it's the "best".
