I was truly speechless the moment I found out Steve Jobs passed away — so suddenly, only a day after Apple’s first big announcement without him at the helm. I wonder if he held on, wanting to see the transition to Tim Cook. Perhaps that was pure coincidence. Or maybe he held on for another reason: to see Siri announced to the world.
Siri was Apple’s biggest news on October 4th, but many consumers shrugged it off: “It’s cool, but it’s not the iPhone 5 we expected. And this voice-driven technology has been creeping forward for some time. It’s hardly big news.” After recovering from the shock and taking time to reflect on what Jobs really did during his lifetime, making computer-driven technology usable for the everyday person, I’m thinking that, for Jobs, Siri might have been the beginning of something much greater that Apple has been dreaming about for a long time: something significant that would eventually, once again, lead to a dramatic shift in personal computing.
Steve Jobs and Steve Wozniak made computing personal foremost by changing the interface. My lifetime happens to coincide with their work, so I can attest to the effect of their genius on ordinary individuals like me. I grew up on the Apple II in grade school, my family owned an Apple IIGS (limited edition, bearing Woz’s signature!), and I eventually moved on to the Macintosh interface and the subsequent Windows graphical interfaces modeled on it.
The Apple IIGS gave me my first taste of a primitive mouse-driven GUI. The introduction of the graphical operating system was the pivotal shift that put the “personal” in personal computer. Everything hinged on this until Apple redefined the elegant human-to-computer interface again with the iPhone and iPad, leveraging touch and gestures to make the device even more natural and personal.
Now, a great many people are claiming that touch and gestures are how we’ll all interface with computers in the future and that the mouse as we know it is dead. (Here’s one example: The Mouse Dies. Touch and Gesture Take Center Stage.) And while I don’t deny the usefulness of touch screens in simplifying the interface, I’ve said before that they aren’t a magic bullet. They actually intimidate some newcomers. (Don Norman, the great design guru, also highlights some issues in Gesture Wars.) Though the harmony between hand and screen has been streamlined, users still must understand graphical conventions. That’s why I’ve been posting for some time that core computing concepts still need to be taught to newcomers. And that’s also why I’m so excited about Apple and Siri: I think Jobs saw it as the way to reinvent the human-to-computer interface once again…
Your voice is the next interface
Back in 1987, Apple released this futuristic video showcasing a voice-commanded personal assistant. (Note also that the implied date in the video is either September of 2010 or 2011, very close to the actual Siri announcement.)
Though the depiction above feels a little more like the ship’s computer from Star Trek: The Next Generation, Siri is likely not (yet) as advanced as the personal assistant in the Knowledge Navigator video. But it’s not hard to see the connection. And I’m by no means the first one who has drawn comparisons between Siri and Knowledge Navigator (and Star Trek for that matter). Just do a web search and you’ll find plenty of hits. But here’s the salient point…
If you could reliably control your computer by talking to it, as on Star Trek, then the need to understand graphical UI elements and gestures is greatly reduced, and the barrier to computing for newcomers becomes virtually non-existent. Your voice doesn’t need to be taught. The interface is built in, and the conventions are limited only to what the computer can understand.
As the person who wrote the primer for newcomers learning to understand desktop PC interfaces, the prospect of using one’s voice as the primary interface absolutely thrills me. Think of this: no need to teach someone how to relate to the computer. No need to explain icons and procedures. The teaching of the “interface” is essentially offloaded to whoever teaches the individual to speak. No longer would computer vendors be burdened with GUI usability; they could focus solely on voice command recognition. This would truly be a revolution in computer interface, and it’s only a matter of time before the technology is powerful and adaptive enough to provide this capability. Apple may simply be, as usual, taking a vision to market first.
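To make that last point concrete: at its core, a voice interface reduces to mapping recognized speech onto actions, and the “conventions” the user must learn shrink to the phrasings the system understands. Here is a deliberately toy sketch in Python. The command table and action names are my own invention, the transcription step is assumed to have already happened, and a real assistant like Siri is of course vastly more sophisticated.

```python
# Toy sketch of voice-command interpretation (NOT how Siri actually works).
# Assumes speech has already been transcribed to text by a recognizer.
# The whole "interface" is the vocabulary this table understands.

COMMANDS = {
    "open": "launch_app",        # "Open my calendar"
    "send": "compose_message",   # "Send a note to Bob"
    "remind": "create_reminder", # "Remind me to call Mom"
}

def interpret(utterance: str) -> str:
    """Return the action for the first command verb found, else 'unknown'."""
    for word in utterance.lower().split():
        if word in COMMANDS:
            return COMMANDS[word]
    return "unknown"

print(interpret("Remind me to call Mom at 5"))  # create_reminder
```

The point of the toy is the asymmetry it exposes: the user brings no mental model of icons or windows, only ordinary speech, while all the burden of “usability” lives in how generously the command table interprets that speech.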
The future of interfaces
While the mouse may be history very soon, I doubt some artists will ever give up a physical stylus or external device for detailed pixel-resolution work. And touch screens are certainly here to stay. But I believe that Steve Jobs was preparing to take us to a place where both hand-powered interfaces and icon-based operation as we know them take a backseat. I’d like to know what Ted Nelson thinks of this, since he suggested that any person ought to be able to understand a computer within ten seconds. Ted, maybe the future Jobs was planning to bring us was not a world where we understand the computer, but one where the computer understands us. If we can speak to our computers as we would to a fellow human and have them obey us reliably, then anyone who can speak, regardless of prior “computer experience,” could immediately accomplish the most common computing tasks without the overhead of pre-existing mental models for metaphor-based software operation. And that would mean that, though he didn’t live to see it fulfilled, Steve Jobs would have once again orchestrated the rebirth of personal computing for ordinary people.