MIT Media Lab cofounder Nicholas Negroponte observed at a recent TED event: “I look today at some of the work being done around the Internet of Things and it’s kind of tragically pathetic.”
The “tragically pathetic” label has been especially fitting for wearables, considered the hottest segment of the Internet of Things. Lauren Goode at Re/Code wrote back in March: “Let me guess: Your activity-tracking wristband is sitting on your dresser or in a drawer somewhere right now, while it seems that every day there’s a news report out about an upcoming wearable product that’s going to be better, cooler, smarter.”
All of this was going to change when Apple finally entered the category with its smart watch. Many observers hoped that Apple’s design principles, obsession with simplicity, and track record of delighting users with easy-to-use products would finally give the world a useful and fun wearable.
Instead, we got a good-looking wristop computer. Not a simple, intuitive, and focused device but a generic, complex product with too many functions and options. Kevin McCullagh wrote on fastcodesign.com: “I can’t help but think Steve Jobs would have stopped the kitchen sink being thrown in like this. Do we really need photos and maps on a stamp-sized screen, when our phones are rarely out of reach? For all the claims of a ‘thousand no’s for every yes,’ the post-Jobs era is shaping up to be defined by less ruthless focus.” Back in June, Adam Lashinsky had already made this general observation about the potential loss of the famed product development discipline: “Apple, once the epitome of simplicity, is becoming the unlikely poster child for complexity.”
“Complexity,” however, does not tell the whole story. By introducing a watch that is basically a computer on your wrist, Apple missed an opportunity not just to reorient the wearables market to something much better than “tragically pathetic,” but also to define the design and usability principles for the Internet of Things.
In his TED talk, Negroponte highlighted what he called “not a particularly enlightened view of the Internet of Things.” This is the tendency to move the intelligence (or functionality of many devices) into the cell phone (or the wearable), instead of building the intelligence into the “thing,” whatever the thing is – the oven, the refrigerator, the road, the walls, all the physical things around us. More generally, it is the tendency to continue evolving the current computer paradigm—from the mainframe to the laptop to the wristop computer—instead of developing a completely new Internet of Things paradigm.
The new paradigm should embrace and evolve the principles of what was once called “ubiquitous computing.” The history of that vision over the last two decades may help illuminate where the Internet of Things is today and where it may or may not go.
In 1991, Mark Weiser, then head of the Computer Science Lab at Xerox PARC, published an article in Scientific American titled “The Computer for the 21st Century.” The article opens with what should be the rallying cry for the Internet of Things today: “The most profound technologies are those that disappear. They weave themselves into the fabric of everyday life until they are indistinguishable from it.”
Weiser went on to explain what was wrong with the personal computing revolution brought on by Apple and others: “The arcane aura that surrounds personal computers is not just a ‘user interface’ problem. My colleagues and I at the Xerox Palo Alto Research Center think that the idea of a ‘personal’ computer itself is misplaced and that the vision of laptop machines, dynabooks and ‘knowledge navigators’ is only a transitional step toward achieving the real potential of information technology. Such machines cannot truly make computing an integral, invisible part of people’s lives.”
Weiser understood that, conceptually, the PC was simply a mainframe on a desk, albeit with easier-to-use applications. He misjudged, however, the powerful and long-lasting impact that this new productivity and life-enhancing tool would exert on millions of users worldwide. Weiser wrote: “My colleagues and I at PARC believe that what we call ubiquitous computing will gradually emerge as the dominant mode of computer access over the next 20 years. … [B]y making everything faster and easier to do, with less strain and fewer mental gymnastics, it will transform what is apparently possible. … [M]achines that fit the human environment instead of forcing humans to enter theirs will make using a computer as refreshing as taking a walk in the woods.”
Ubiquitous computing has not become the “dominant mode of computer access” largely because of Steve Jobs’ Apple, which successfully invented variations on the theme of the Internet of Computers: the iPod, the iPhone, the iPad. All of them beautifully designed, easy to use, and useful. All of them cementing and enlarging the dominance of the Internet of Computers paradigm. Now Apple has extended the paradigm by inventing a wristop computer. That the Apple Watch is more complex and less focused than Apple’s previous successful inventions matters less than the fact that it continues down their well-trodden path.
While the dominant paradigm has been reinforced and expanded by the successful innovations of Apple and others, the vision of ubiquitous computing has not died. Today, when we are adding intelligence to things at an accelerating rate, it is more important than ever. Earlier this year, I asked Bob Metcalfe what is required to make us happy with our Internet of Things experience. “Not so much good UX, but no UX at all,” he said. “The IoT should disappear into the woodwork, even faster than Ethernet has.” Metcalfe invented Ethernet at Xerox PARC at the same time Weiser and others there were working on making computers disappear.
Besides ubiquity, there are at least two other dimensions to the new paradigm of the Internet of Things. One is seamless connectivity. In response to the same question, Google’s Hal Varian told me, “I think that the big challenge now is interoperability. Given the fact that there will be an explosion of new devices, it is important that they talk to each other. For example, I want my smoke alarm to talk to my bedroom lights, and my garden moisture detector to talk to my lawn sprinkler.” No more islands of computing, a hallmark of the Internet of (isolated) Computers.
Another important dimension of the new paradigm is useful data. Not big or small, nor irrelevant or trapped in a silo, just useful. The value of the “things” in the Internet of Things paradigm is measured by how well the data they collect is analyzed and how quickly useful feedback based on this analysis is delivered to the user.
Disappearing into the woodwork. All things talking to all things. Useful data. It may not be Apple, but the company or companies that master these principles will usher in the new era of the Internet of Things, where we finally get over our mainframe/PC/wristop-computer habit.
[Originally published on Forbes.com]