Thoughts on the culture that gave birth to the personal computer
Some reactions to Walter Isaacson's request for comments about a rough draft of a section of his next book
Walter Isaacson, the experienced author who wrote the popular book about Steve Jobs, has posted the start of "The Culture That Gave Birth to the Personal Computer", a sketch of a draft of his next book on the innovators of the digital age. He asks for comments. Here are some of mine. They provide an additional viewpoint from someone who participated in, and contributed to, the personal computing world. I hope he finds them helpful.
The draft of that chapter implies, as I see it, that an initial vision for computing was for personal use, but that somehow computers instead evolved into big things for industry and the military, and that they continued that way until some hippie, anti-establishment renegades brought them back to personal use.
I hope this is not the thrust of the book. I don't believe that it will be (see this potential part about Bill Gates in a Harvard publication), but just in case, here is my reaction.
I think that would be a misunderstanding of how the personal computer came into being. Vannevar Bush's vision was always in people's minds. It wasn't that Vannevar Bush, and later others, didn't want to build personal computers: they didn't build them because it wasn't possible at a reasonable price at the time. Many engineers were always making computers more and more personal whenever they could.
The driving forces were not LSD and anti-establishment sentiment. Some of those who participated in making the personal computer happen may have been hippies and anti-establishment, but certainly not all were, nor did that matter as much as other factors. Many people had the vision of personal use of computers and strove to move it closer and closer to reality. Other factors, such as science fiction, probably had more influence. (I say this as one who slept in the mud at Woodstock in 1969 and had hair almost down to my waist when I went to Harvard Business School in 1977, so I have nothing against hippies. I also watched the movie 2001 in 1969 and Star Wars in 1977.)
The story of which people were influenced by which science fiction could be an interesting exploration. (For example, there is the story of iRobot co-founder Helen Greiner and her inspiration from R2-D2. See Turning Inspiration Into Our Own. We all know how Arthur C. Clarke's HAL 9000 introduced many people to the dangers of centralized computing.) Just looking at who predicted or named something that later became common is not enough. Tracing what actually was the spark for whom may be more useful. A chart of the many sources of inspiration and their lineage to final, adopted products might be interesting. The inspiration may come in childhood, for example from reading Tom Swift, Jr., books. (Those books, as I recall, were very empowering to an engineer. They asserted that technical prowess could make great things and help the world. It's just that the actual "how to do it" was missing. We ended up providing that when we grew up. Steve Wozniak is one of many who have mentioned those books.)
All sorts of people took to personal computing when it came out, not just anti-establishment people who took drugs. It resonated with doctors, accountants, Wall Street people, and many others. They bought the machines and worked to find uses for them. When humans get a tool, they find uses for it in their lives.
An example of this is Ben Rosen. He was an analyst at Morgan Stanley on Wall Street who was struck by the personal computer back in the late 1970s. He, and others, helped popularize it with financial people. VisiCalc helped -- it was a tool that spoke to MBAs (being designed by an MBA helped...). Those people went on to buy personal computers and fund development through investment. Once word processing became inexpensive on affordable personal computers, people in the press understood its value and sang its praises as more than a curiosity.
Speaking of the press, the role of the early hobbyist publications, like Byte, Kilobaud, Creative Computing, and Dr. Dobb's, is important. The first two (related) were from Peterborough, NH, and came out of the ham radio world. Creative Computing came from David Ahl, who in the early 1970s worked at DEC and pioneered putting a PDP-8 minicomputer in a desk (not on it) to make a person-friendly computer for the education market. That same device became the basis for a small business computer and then DEC's first word processor around 1975 (which I helped develop). DEC's move into word processing meant it needed to help the daisywheel printer businesses that were starting up then meet its standards of reliability. (DEC was a major producer of low-end printers at the time, but did not make letter-quality printers.) Those printers became the preferred output device for many Apple II configurations.
Technology like computing continually evolves. There are many steps along the way, taken by many individuals and companies. If you follow stories of the history of engineering, such as those from the writer Henry Petroski (e.g., The Evolution of Useful Things), you'll see some of this. Writing only of those who also have interesting personalities or stories, and ignoring the importance of the links in the chain (and the failures of those who tried), misses that. (See Henry Petroski's To Engineer is Human: The Role of Failure in Successful Design.)
The idea many have that computers had been the "province of rich and powerful institutions, who, understandably, have developed them primarily as bookkeeping, sorting and control devices" is somewhat myopic and ignores economics and what was going on in computing, a field much wider than just the mainframe business.
Computing technology, like all electronics, started out very expensive and bulky for little power (storage and computation). The initial uses could only be for small amounts of numbers with the life and death of a country at stake (such as the early military uses that funded many early computers in the 1940s and 1950s, initially artillery and later flight simulator control and missile tracking). Small amounts of data with little computation also lent themselves well to accounting use in business. We are talking about only hundreds of bytes per person, spread out among thousands of people in a large company, with just some simple arithmetic. Techniques were developed to share an expensive computer among many people at once, as if it were their own. (I remember studying for a French test around 1968 using a GE timesharing program I wrote for the purpose.) As hardware got less expensive (thank you, space program and Moore's Law, among others), we could start thinking about an engineering project using a small computer, such as a DEC PDP-8, which fit on a desk. Further advances in miniaturization and cost reduction brought us to word processing, starting out as one computer for one secretary -- tens of thousands of bytes per person with several thousand dollars for the equipment. The Intel 8080 and Motorola 6800 chips brought computing into cash registers and dedicated devices. The MOS Technology 6502 (a major drop in price, thanks to a company with a major Pennsylvania presence) and the latest generation of memory chips finally brought the price of enough computing to run a tightly written BASIC system down to an affordable purchase for dedicated hobbyists. Later advances meant that we could do computation with larger amounts of data, like photos and then video.
The advance wasn't because of a change in attitude; the change was economic availability. All along the way, in research institutions around the world, people were using expensive computers for "personal" use, exploring what could eventually become common. A DEC PDP-1 computer was used for the pioneering Spacewar video game at an MIT research lab -- in 1962. We used text formatting programs on various research computers in the 1960s and early 1970s to do personal word processing well before Wang and later WordStar. Employees at companies like DEC salvaged parts from the scrap heap and built simple computers for their homes. Ivan Sutherland developed Sketchpad at MIT in 1963, a WYSIWYG drawing program that inspired people from Douglas Engelbart to Alan Kay (later one of his students) to even me years later. The Architecture Machine Group at MIT (later to become the Media Lab) did experimentation. As costs came down, we who had been exposed to this experimentation grabbed the current technology and tried to apply it. Often it was too early, but eventually the timing was right. This is similar to other engineering areas.
The implication that DEC, which pioneered bringing entire computers to the desks and offices of individuals, could be the main symbol of "the establishment" ignores history. DEC is probably one of the most important companies on the road to personal computing. The fact that it was also sort of the Moses that didn't get into that promised land (mainly as an example of Clayton Christensen's Innovator's Dilemma) is sad but not surprising. (There was a contingent within DEC that wanted to make a personal computer when the 8080 came out -- there were prototypes that I saw, and DEC actually had various packaged PDP-8 and PDP-11 devices that could have been sold as such. Ken Olsen just didn't believe in it for regular people, for Innovator's Dilemma reasons, as I saw it then when I was at DEC. They were also wedded to their own instruction sets, which precluded the cheaper hardware of the time, and they had a booming business in PDP-10s and VAXes, both with their own systems and with UNIX.) Dismissing DEC the way many do shows a lack of appreciation for that history. So many of the pioneers of personal computing learned from and were emulating DEC. (Bill Gates has related how he learned so much by reading listings of software for a DEC computer that he found in garbage cans at the Computer Science Center. Gary Kildall got his inspiration for CP/M from DEC and used DEC computers in his work.)
I hope Isaacson is traveling around the country to learn about early personal computers from Radio Shack and Commodore. In addition, there were lots of other personal computers in the early days from all over the country (and the world), too. Most of them didn't have management as skilled (or lucky) in business as the later "winners", but they all had variants of the same vision. (An example of luck and business acumen: for various reasons VisiCalc was released on the Apple II first, about a year before it came to other personal computers of the day. Jobs said in a 1990 interview for BBC/WGBH, at 35:49, "And so if VisiCalc had been written for some other computer you'd be interviewing somebody else right now.") The personal computer that eventually became the most popular was developed in Florida by people who had been involved in IBM's minicomputer development.
It's also interesting how many people involved in personal computing have backgrounds that include Harvard, MIT, Brown, Carnegie-Mellon, the University of Illinois, BBN, DEC, and other non-California institutions, in addition to Berkeley, Stanford, and Xerox PARC. Maybe, instead of just interesting people, there were fertile places? (Which had interesting people, of course, but then all places do.) What we now know as personal computing (the applications and the hardware) came from lots of parts, some of which came more from one location than another.
Interviewing the winners is sometimes a problem. They often believe they were the only ones working on something or don't understand that what they take for granted or built upon had a long history. To understand the march of a technology, you need a bigger picture.
With computing, we have been constantly evolving towards the personal computer, always using whatever tools were available at the time. Trying to pin things on a few visionaries, while ignoring the people who actually made all those steps happen (perhaps without even knowing that it fit someone else's vision, too), is bad for society. It makes us forget that we need investment in that constant evolution, and it makes us think we as a society can just wait for some savior to point the way and all of the pieces will magically be there. Yes, one person can make a big difference (as I know personally), but they are not alone (as I also know). It's fun to watch the golden spike being driven to finish the railroad, but a lot of problems had to be solved to get to that day.
It would be nice if a writer of Isaacson's ability and stature could help here and paint a more realistic picture from which policy makers can learn. It would be wonderful if he could find a way to weave his special people stories into a tapestry that includes all of those without the special stories who made it all possible and had many of these same visions. They needn't be there by name, but the importance of the fabric they create to the final result must be made clear. A few gold threads do not make a tapestry.
The culture that gave rise to the personal computer was the long-held belief in a computer as a tool for everyone and the culture of engineering constantly improving on what we have.
-Dan Bricklin, 23 December 2013
© Copyright 1999-2018 by Daniel Bricklin
All Rights Reserved.