Wed, 31 Dec 2008
The year end is a time for looking back as much as for looking forward. People compile their lists of "Top Ten" movies, music, political events, and the like, for the year just past. Others look forward, placing their moistened fingers in the air, and, having discerned the direction of the prevailing winds, more or less confidently prognosticate on what the year to come has in store for us.
I hold no special claim to any abilities in the area of predictions -- if I did possess any, I would undoubtedly be spending more time at the racetrack and the casino than I do now. Instead, I use these short days at year's end to look back, not only at the preceding twelve months, but over even longer time frames. Having also turned fifty this year, I look back at my life so far, and find myself asking (among other things) about how fortunate I seem to be to have grown up at just the right times to have been in on the start of a true historical era or two.
One such historical era at which I saw the light of the first dawn was that of manned space travel. I've written about being a space buff earlier in this forum. NASA and I are almost exactly the same age, and despite the interregnum of the space shuttle era, history books already look back on these days as a watershed, when humans took their first tentative steps out into the wider universe around us.
A recent post on a friend's blog entitled "I'm an 8-track guy in an iPod world" discussed, among other things, how the onrush of technological change had left even him, a technically savvy guy, puzzling over how to set up the iPod he and his wife had given their seven-year-old daughter for Christmas. For most kids today, electronic gizmos of all kinds are truly no-brainers, I think mostly because they've never known a time when such things did not exist.
Having experienced feelings similar to my friend's more than once upon being confronted with the latest and the greatest of things technological (including my own four-hour wrestling match with my wife's iPod a couple of years ago), I got to thinking about other historical eras I have been privileged to witness from the early days -- in particular, those involving the onrush of technology. One such area involves a facet of technology in which I have been involved academically and professionally for (what is now) the larger part of my life: computers.
I cannot claim to have witnessed the actual dawn of the computer age -- many historians agree that the modern computer age actually began in the early 1940s, with ENIAC and the other pioneering electronic digital computers. My dad was involved in the computers of that era, graduating from MIT in the early 1950s and promptly going off to join a (then somewhat fledgling) company called International Business Machines. ("What do they do?" his father asked him on hearing the news.) So if there is such a thing as a genetic predisposition to working with computers, I probably had it.
No, my own experience with computers dated from a somewhat later time in the mid 1970s, when computer technology was taking a sudden and somewhat radical fork off of the evolutionary path down which it had been moving for the previous thirty-odd years. It was the beginnings of the era of the personal computer.
Most people today who are just a few years younger than I can only dimly understand what computers had been like since their early days: this was the era of the mainframe, when a computer was a device that took up the larger part of a room, had to be placed on special raised floors in glassed-off rooms, and which required special electrical and environmental equipment including tons of industrial-style air conditioning and massive power supplies. Only the most worthy acolytes were allowed into the actual computer room itself; the unwashed masses of programmers would instead bring their offerings of programs typed onto hundreds or even thousands of punch cards to a remote altar where the cards would be read into the computer, and where revelations from on high would eventually be delivered to the faithful in the form of printouts.
(Note, by the way, that the one thing always shown as "the computer" in movies and on television from the mainframe era was, in fact, most definitely not the computer: probably because they were one of the few moving parts, the classical image of the mainframe computer most people have is actually spinning tape drives, which were quite similar to (albeit much larger than) reel-to-reel audio tapes, and which were used to store programs and data. The actual "computer" part of the mainframe was mostly some lights, a few dials, and some switches, which was not quite as visually compelling as tapes going forwards and backwards at varying speeds -- when the computer is calculating, the reels spin; when the answer is ready, the reels stop.)
The underlying technologies from which computers were constructed continued to change and evolve. Vacuum tubes and plugboard wiring (think of a turn-of-the-last-century telephone operator connecting calls by literally plugging the two circuits together with wires) from the 1940s and 50s gave way to computer logic and memory circuits built with solid state components and integrated circuits ("chips"). By the 1970s, some technological innovators were thinking of ways to leverage technology to construct a computer's entire arithmetic and logic computation unit on a single chip. The earliest attempts to do this were popularized not by computer geeks, but by amateur radio enthusiasts, of all people. And by the middle of the decade, people began to offer computers in kit form that were moderately affordable ($500-$1000 each, compared to the millions of dollars mainframe computers typically cost), and which could be put together by someone with a modicum of electronics tools and skills.
It was at this time that my path crossed into the technological stream of personal computing, due to (of all things) a budget cut.
My high school offered a class in computer programming as a math elective for seniors. Since the number of graduates intending to go on to study science or engineering in college constituted a definite minority of the class, the majority wouldn't be taking calculus that year, and thus the programming class was very popular. However, as the school year was ending in June of 1975, the educational group which had been providing the timesharing terminal used for the programming course had its budget cut, and they rather abruptly and unceremoniously "pulled the plug" and removed the equipment from our school. Now the Math Department was left with a hundred seniors scheduled for a computer programming class in September, but with no computer on which to actually run their programs.
Dad heard about the difficulties, and showed the head of the Math Department and the Principal an article describing a "personal computer" which would be suitable for the programming class to use, for the somewhat modest cost of only $995. If the school would buy it, he (and I) would put it together and test it to be ready to go in the fall. They immediately (and, to be honest, with a great deal of foresight in terms of the school's educational future) cut the check and sent in the order. A couple of weeks later, several large packages arrived, which I carried home on the school bus.
Now, people today don't think that spending $1000 on a PC is outrageous, and they don't think of the purchasing and setup process as being particularly onerous -- the computer comes in a box, and you plug in the color-coded cables to attach the power, keyboard, mouse, monitor, and network connection; press the "On" button, wait 45 seconds for the machine to boot up, and you're on your way.
The unpacking and setup process for those pioneering PCs was somewhat more involved than that.
For those of you who have ever looked inside your PC today, you've seen that most of the tiny electrical gizmos (resistors, capacitors, diodes, transistors, coils, chips, etc.) are installed on a single large circuit board called the "motherboard". But back in the covered wagon days, what you received was a collection of empty circuit boards, and boxes and bags full of gizmos. What that meant was that it was up to you to get out your soldering iron and assemble the computer's innards yourself, component by component and wire by wire. Working slowly and deliberately (mostly due to my lack of knowledge and experience with electronic assembly), this meant that Dad and I planned on six weeks just for assembly and testing. It was a pretty steep climb up the learning curve for me, but with Dad kibitzing during the evenings and on weekends, I eventually acquired a modest level of proficiency, and between the two of us the whole thing was assembled in about two months.
The computer's startup was also a more elaborate ritual than simply turning on the power switch. That was the necessary first step, of course, but after that, nothing happened. There was no automatic boot sequence, no operating system like Windows or Linux waiting in the wings. Instead, what you had to do was use the toggle switches and lights on the front panel to enter whatever program you wanted to run -- manually, one instruction (a numeric code) at a time. When the time came for the first power-on test, Dad switched on the computer (no pops, sizzles, or smells of ozone or burning insulation, a good sign), and then I entered the ten-instruction test program on the toggle switches. It was about as simple a program as you could have: starting with zero, the computer would add one to a storage register, and then display the result on the front panel lights, looping back to repeat the process indefinitely. Then, with not a little trepidation, I pressed the "Run" switch. The display lights all went dark, and then, too fast to really follow completely, the sequence of counting in binary arithmetic -- 0, 1, 10, 11, 100, 101, etc. for 0, 1, 2, 3, 4, 5 and so forth -- was being displayed. After having invested two months of spare time building the thing, and the school having invested a thousand dollars in the project, it was good to have wound up with a metal box about two feet square and about eight inches high which could at least count by ones.
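For the curious, here is a rough sketch in modern Python of the logic that ten-instruction program carried out. (The real thing was, of course, a handful of Intel 8080 machine-code instructions keyed in on toggle switches, not Python; the function name and the eight-bit register width are my illustrative assumptions.)

```python
def count_in_binary(steps, bits=8):
    """Yield successive register values as binary strings, wrapping
    around the way a fixed-width hardware register would."""
    register = 0
    for _ in range(steps):
        # Each binary string stands in for the front-panel lights:
        # a '1' is a lit lamp, a '0' is a dark one.
        yield format(register, "0{}b".format(bits))
        register = (register + 1) % (1 << bits)  # add one, wrap at 2^bits

# The real program looped forever; here we just show the first few steps.
for lights in count_in_binary(6):
    print(lights)
```

Run as written, this prints 00000000 through 00000101 -- the same 0, 1, 10, 11, 100, 101 progression that flickered across the panel, just slowed down enough to read.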
The final step in the construction process was to add some hardware and software to the computer which would allow the students to actually use it to write programs in the fall. This would be done by plugging in a teletype machine to the communications port on the back of the computer, and loading up a program that allowed people to type in, run, and save their own programs -- much easier than trying to key stuff in using toggle switches. Saving programs would be done by using a punch on the teletype to record the program text as a series of holes on a strip of one-inch-wide paper tape.
After a couple of days of cleaning up and connecting the teletype, Dad and I took out the last small unopened box which came with the computer kit: a sheet of instructions, and a fan-folded strip of paper tape about 200 feet long. An adhesive label on the front of the tape described it: Altair BASIC Ver. 1.0, and a footnote on the instruction sheet identified the company which had created it: Copyright 1975 Micro-Soft. (Yes, this (at the time) two-employee company would eventually drop the hyphen from its name and become the juggernaut we all know today as Microsoft, and those two employees (Bill Gates and Paul Allen) would go on to become billionaires -- proof once again of the old adage "great oaks from tiny acorns grow".)
Now the startup sequence for the computer was a little more time-consuming: from the instruction sheet, I would key in a program that essentially told the computer to listen to the teletype for further instructions. After pressing "Run", I would attach the tape to the reader on the teletype, and start it up. It was not a speedy process, taking about twenty minutes for the entire tape to be read. Finally, when the tape ended, the reader stopped, the front panel lights dimmed, and after a few seconds pause, the teletype printer would clatter to life:

ALTAIR BASIC VER. 1.0
at which point the computer would be able to execute programs written in the BASIC programming language used in the class. The computer had about 8000 bytes of memory (not megabytes or gigabytes, but plain old bytes, which is only about half the total number of characters in this essay). That's shockingly small by today's standards, but it was more than adequate to write and run some modestly sophisticated programs.
With everything working, we packed up the computer and the teletype into the back of the station wagon and drove it over to the school in late August, setting it up in the back of the Math Lab, and giving the teachers a demonstration of how it worked and how the students would use it. And with a few upgrades (more memory, a video monitor (actually a 9" black-and-white television), and a cassette tape recorder for saving and reading programs), this early personal computer served hundreds of students over the next five years or so, after which it was "retired" to a storage closet in the bowels of the school, where, as far as I know, it still exists today.
During those years, I myself went on to study computers in college, and then to get a job at a large bank where I would be able to use the programming skills I had started to develop back in high school on that early PC. At that time in the business world, computers were still squarely in the mainframe era, and that was the environment in which I worked. Unlike today, it was not the case that everyone from the mail clerks and the security guards to the Chairman and the CEO had a computer on their desk. We in the computer side of the business knew about PCs, of course, but the problem was that almost no one could figure out anything useful to do with them at work. Certainly it was impossible to get all the non-technical management and staff to learn how to write programs to do what they wanted -- that was our job, of course. The evolution of the personal computer had reached a plateau, waiting for the next evolutionary jump to move it forward again.
This jump came in the early 1980s thanks to a fortunate conjunction of hardware and software advances. The hardware advance came when the citadel of mainframe computing, IBM, decided to get into the personal computer business. The IBM brand name was the imprimatur that businesses needed to seriously consider even buying PCs -- before that, few companies would be willing to spend large sums on computers from small, "no name" companies like MITS, IMSAI, Commodore, or Atari, who dominated the PC market. As the old saying went, "Nobody ever got fired for buying IBM." It also meant that IBM's famed field service teams were available to help fix things in case of problems.
The software advance came from a company called Lotus Development Corporation, which had created a software package for the IBM PC called "Lotus 1-2-3". This was not the first PC spreadsheet program (bonus points if you remember Dan Bricklin's "VisiCalc"), but it was functional and became wildly popular -- the first "killer app" for the personal computer. Using the spreadsheet metaphor that nearly all accountants and business people could understand, the average business person could use 1-2-3 to do their own financial analysis. Even managers for whom operating the car stereo was the extent of their technical expertise took to 1-2-3 to get their work done without having to wait for us programmers to write custom programs for them to run on the mainframes we still used. I've always attributed 1-2-3's success to the fact that Lotus went out of their way in the product manuals to reassure anyone using the program that this was just a kind of automated accounting, so as not to frighten off the overwhelming majority of business people who described themselves as "non technical"; in fact, the word "programming" only occurred once in the entire user manual, and then only to tell the reader that writing Lotus macros was "almost like programming". Writing macros was programming, of course, but the marketing for 1-2-3 was so slick that I was constantly amazed at the number of people at work I would eventually meet who would tell me that they didn't know anything about programming, "but I can write Lotus macros".
The synergy of IBM's hardware and Lotus's software fueled that next great leap forward in the evolution of the personal computer in the 1980s. Of course, hardware continued to evolve, and software evolved in sophistication and complexity along with it. In the 1980s, one of the biggest things you could do at work with your PC was to connect it up to the mainframe and use it like a video terminal; by the 1990s, local networks of PCs were widely deployed, and connecting to the Internet was the next big thing. IBM made a lot of money with their PCs, but they were eventually overtaken by other companies and lost their cachet, no longer able to command a premium price for what had become just another business commodity.
So we now would think it weird if a movie or television program showed an office without a computer on every desk. And the same large-scale circuit integration that made the PC possible has also meant that computers of one sort or another are now ubiquitous and essentially invisible: cell phones, thermostats, planes, trains, automobiles, televisions, music players, and gaming consoles -- these and many, many more things with which we come into contact every day, are all filled with embedded computer technology, essential but more or less invisible to us -- until, of course, you try to load some songs onto your kid's iPod.
Only then do I realize that my having been involved with personal computers literally from the beginning, while giving me a leg up and a head start compared to many people, does not guarantee that the ever-accelerating pace of technological change will not someday leave me in the dust. Sometimes I feel like the generals planning for the next war by assuming it will be much like the last one, and my kids, for whom so much of this is as natural as falling off a log, must feel like the young colonels who don't understand what all the fuss is about.
After all, it's just a computer, isn't it?