I’ve recently been reading quotes by Theodor Nelson, the early computer technology thinker, and finding some of his thoughts remarkably similar to (and yet others at odds with) my ideas on core concepts, interfaces, and the rationale behind The Ultimate PC Primer. Some people read Ted’s statements and dismiss him as a modern-day curmudgeon. I don’t subscribe wholesale to all of his ideas, but I definitely think he has a great number of insightful perspectives on technology and the state of computing that we can learn from.
Ted’s motto is allegedly:
A user interface should be so simple that a beginner in an emergency can understand it within ten seconds.
I largely agree with the intention behind the motto, and I think there are two ways to approach that issue. The first would be to try to build the elusive perfectly intuitive interface. I think the computing industry (and I include the mobile market in that, since smartphones are really just small computers now) is slowly starting to realize that there may be better ways of heading toward that goal. But to borrow from computer legend Alan Kay, the real computer revolution hasn’t really begun yet.
Short of building miracle interfaces that can adapt to everyone for any imaginable purpose, the other short-term solution might be to simply standardize the interface concepts.
Ted Nelson is quoted as saying:
In order for something to Catch On, it has to be standardized. Unfortunately, there is motivation for different companies to make their own little changes in order to restrict users to their own products. The best example of how to avoid this: Philips patented its audio cartridge [i.e. the standard audio “cassette”] to the teeth, but then granted everyone free use of the patent provided they adhered to the exact standard.
Audio formats newer than the cassette have since been invented, so this example might not seem potent at first. (Feel free to think back to when it was, or substitute some newer technology.) In order for innovative new technologies to be of use to the general public, they have to serve us at some point rather than vice versa. I think that’s the bulk of many of Nelson’s grievances with the current state of “technology.” It occurs to me that by standardizing a physical technology, different manufacturers make devices the same, giving consumers a narrow target which they can focus on, learn, and readily adopt. (For example, a tape player from Toshiba works the same as one from Sony.)
What might have caught on fully enough to meet Ted’s standard? I don’t know, but I’d like to think that the wired telephone measures up (and perhaps the horseless carriage, to a lesser extent). It’s one of the most pervasive devices, and fittingly enough, it can be used in emergencies in less than ten seconds. But it also took a lot of research, development, and specific training to get it to the point of general adoption and consistent usage. So even that device, as I pointed out in my prior post, isn’t perfectly intuitive.
On intuitive interfaces, Nelson says:
Almost nobody, looking at a computer system for the first time, has the slightest idea what it will do or how it should work. What people call an “intuitive interface” is generally one which becomes obvious as soon as it is demonstrated. But before the demo there was no intuition of what it would be like. Therefore the real first sense of “intuitive” is retroactively obvious.
Ted’s right, of course. Today’s interfaces are “retroactively obvious” (unless you’ve been one of the users climbing the stairs since the beginning), which is why I’ve had to write a book explaining interface concepts. Nelson and I both would have preferred such explanations not be needed. (Ted, if you’re reading this, I share your frustration at having to explain the concept of the computer’s “desktop.”)
So can this issue be solved by standardization? If standardization is meant to serve users, what’s standard on PCs? Talking about software now instead of hardware, has there been enough standardization in PC interface concepts for common PC usage to “Catch On,” at least for most novices who are willing to learn? On graphical user interfaces (which Ted would like to call PUI to acknowledge their birthplace), he says:
All of these clumsy, lookalike interfaces are based on those designed at Xerox PARC in the early 1970s.
And that’s my point. The core concepts of graphical user interfaces haven’t changed significantly in many years, which is why I’ve been able to write The Ultimate PC Primer, a book that is really designed to explain the core concepts that seem to be leveraged by most personal computer operating systems in use today. But here’s the difference between what Nelson called “standardization” for hardware and what I might simply call “baselining” for PC concepts: PC interface concepts haven’t been formally standardized, agreed upon, and adhered to by everyone making software. It just sort of worked out that they’re all (so far) pretty similar (at least for operating systems). This is fortuitous for computer users, because it could be immensely harder for consumers if every interface was drastically different. (Case in point: the varied mobile phone interfaces prior to the touch-screen smartphones.)
Think about today’s innovation-driven tech market. Few companies seem really willing to agree, for the benefit of consumers, and lock in on a particular standard (though I think they’re starting to learn to balance innovation and consumer need, as evidenced by the HD-DVD vs. Blu-ray battle, which was resolved relatively quickly compared to the VHS/Betamax war or even the Web browser wars, in which there was no clear winner while consumers suffered). Again, just take a quick look at the smartphone scene and you’ll see as much innovation as can be cranked out, with very little “sameness” between vendors. Who suffers? Just ask one of my colleagues who just upgraded to a new smartphone.
So the other option I might propose in this day and age is not worrying as much about standardizing the exact technology deeply (so that every phone and phone network is nearly identical, though I’m sure that would make a lot of things more efficient), but doing so at a shallower level — having the consumer-facing implementations of a technology ride on core concepts which are standardized. The intended result? The user base can focus on learning the core concepts, achieve a baseline of knowledge and understanding, and go on to use just about any technology “within 10 seconds,” as Ted might say. With interface concepts at least standard, users stand a chance of achieving some kind of baseline knowledge that they can leverage for real productivity, regardless of which exact system or device they’re using. Since personal computer concepts seem (for better or worse) to be reasonably stable and consistent, that’s my target for baselining computer users’ acumen.
Mr. Nelson probably wouldn’t agree with my approach. He would probably think I’m giving in to using metaphors to cover over the limitations of current personal computers and the crutches and vices of their programmers. Ted clearly seems to hold in low regard the current state of computing and the World Wide Web, and I do agree that we’re serving the technology a little too much rather than it serving us. But what is here isn’t going away soon, nor is there much motivation to change it. To be honest, it still astonishes me how accepted tenuous technology is. Technology consumers are willing to adopt cutting-edge, flaky, immature technologies, and that will drive the market, the innovation, and the manufacturers’ willingness to play with all kinds of concepts they will likely refuse to make standard. Apparently there are enough techie consumers to keep feeding the trend. Maybe the companies that control the market are entirely populated by techies who think only of developing things for themselves (and the narrow band of people very much like them).
Whatever the case, tech-driven culture seems to now be the reality, and I choose to accept that this is the current state until it can be made better. But I also know the current state isn’t serving a large number of current and potential users well. And it likely won’t change until more consumers learn what’s really going on and say, “Hey, this stinks. Can’t it be made better?” That’s why I’ve spent a number of years writing a book, posting to this blog, and having nearly weekly conversations about the effect of this stuff on real people.
Ted would like to see the world changed and so would I. Understandably, he seems to want the limitations to be thrown off so that there don’t have to be “interface standards.” I respect that ideal, but I’m choosing to change the world starting with where it is today — limitations included — and my approach is to leverage core concepts and standards, loose though they may be. It has never been my goal to make everyone computer geniuses or to make technology so smart it makes our world like Star Trek, but I am committed to providing a way for newcomers to get to a foundational “baseline” understanding, from which most other computer technology will make sense as they find themselves engaging it daily. Technology can’t just be for the technology-elite. Though the current generations growing up with technology will likely have less difficulty (initially), I believe there is a need to establish a “baseline” for what constitutes adequate proficiency. Without an understanding of standardized core concepts, every generation will forever be playing “catch up” and never “catching on.”