Why The Future Is Not What It Used To Be

by alex-xs

The IEEE Computer Society in March published a report titled “What Will Our World Look Like in 2022?” It identified 23 technology areas that we can expect to disrupt the state of the art, ranging from medical robotics and big data analytics to photonics, 3D integrated circuits, and quantum computing.

The unifying theme for all these technologies is “seamless intelligence,” where everything is connected through ubiquitous networks and interfaces. “We project that by 2022,” the authors of the report say, “society will advance so that intelligence becomes seamless and ubiquitous for those who can afford and use state-of-the-art information technology.”

The IEEE report is a bit different from similar attempts at predicting the future because it comes from technologists, some in academia and others at corporate research labs, and is based in part on a survey of members of the IEEE Computer Society. Typically, predictions are the stock-in-trade of think tanks and research firms. Last year, for example, the McKinsey Global Institute published “Disruptive technologies: Advances that will transform life, business, and the global economy,” identifying 12 technologies that “could drive truly massive economic transformations and disruptions in the coming years.” Earlier this year, Juniper Research published “The World in 2020: A Technology Vision,” identifying 11 key technologies that it believes “will become the most disruptive by 2020.”

Beyond the use of the word “disruptive,” there are other commonalities among the three reports. Robotics and drones, 3D printing, the Internet of Things and wearables, self-driving cars, and cloud computing each appear in at least two of them. But, for the most part, there is not a whole lot of agreement on the disruptive technologies of the future. Photonics, real-time translation, and renewable energy, for example, each appear in only one of the reports.

The IEEE report opens with the famous Yogi Berra quote: “It’s tough to make predictions, especially about the future.” In the rest of this post, I will discuss three reasons why.

  1. Innovations that have made a strong impact on us in recent times obscure other, more important recent innovations.

The first item on The New York Times’ list of the greatest inventions of the 19th century, published in 1899, was friction matches, introduced in their modern form in 1827. “For somebody to whom the electric light was as recent an innovation as the VCR is to us, the instant availability of fire on demand had indeed been one of the greatest advances of the century,” wrote Frederick Schwartz in 2000 in Invention & Technology. Which invention of the last 100, or even 10, years is overshadowing an even more important recent invention?

  2. The road taken is no less important than the end result.

Another difficulty in predicting what the world will look like just five or seven years from now is that some predictions eventually become reality but still miss altogether exactly how we are going to get there. Often, that is the most important (and practical) part of the prediction. To paraphrase Lewis Carroll, if you know where you are going, it matters a lot which road you take.

Many commentators writing this month about the 50th anniversary of Gordon Moore’s article charting a future course for the semiconductor industry (what came to be known as “Moore’s Law”) mentioned his predictions regarding home computers and “personal portable communications equipment.” But they ignored Moore’s prediction that “the biggest potential lies in the production of large systems. In telephone communications, integrated circuits in digital filters will separate channels on multiplex equipment. Integrated circuits will also switch telephone circuits and perform data processing.”

Moore was right that integrated circuits would have an impact on large systems, but he failed to see that “the biggest potential” of the constant and predictable miniaturization he forecasted would lie in smaller and smaller devices, in ubiquitous computing. In 1965, it was difficult to see that centralized systems would be replaced by distributed, anywhere computing. That is why Moore qualified his use of the term “home computers” with “or at least terminals connected to a central computer.”
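
For reference, the trajectory Moore described reduces to a simple doubling law (his 1965 paper projected the number of components per chip doubling roughly every year; in 1975 he revised the doubling period to about two years):

\[
N(t) \approx N(t_0) \cdot 2^{(t - t_0)/T}, \qquad T \approx 1\ \text{year (1965 projection)},\ \ T \approx 2\ \text{years (1975 revision)}.
\]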

  3. We extrapolate from the present and ignore or misunderstand non-technological factors.

Many predictions reflect what the forecasters want the future to be, or simply extend what they are familiar and comfortable with. I have in my files a great example of the genre: a report published in 1976 by the Long Range Planning Service of the Stanford Research Institute (SRI), titled “Office of the Future.”

The author of the report was a Senior Industrial Economist at SRI’s Electronics Industries Research Group, and a “recognized authority on the subject of business automation.” His bio blurb indicates that he “also worked closely with two of the Institute’s engineering laboratories in developing his thinking for this study. The Augmentation Research Center has been putting the office of the future to practical test for almost ten years… Several Information Science Laboratory personnel have been working with state-of-the-art equipment and systems that are the forerunners of tomorrow’s products. The author was able to tap this expertise to gain a balanced picture of the problems and opportunities facing office automation.”

And what was the result of all this research and analysis? The manager of 1985, the report predicted, will not have a personal secretary. Instead he (decidedly not she) will be assisted, along with other managers, by a centralized pool of assistants (decidedly and exclusively, according to the report, of the female persuasion). He will contact the “administrative support center” whenever he needs to dictate a memo to a “word processing specialist,” find a document (helped by an “information storage/retrieval specialist”), or rely on an “administrative support specialist” to help him make decisions.

Of particular interest is the report’s discussion of the sociological factors driving the transition to the “office of the future.” Forecasters often leave out of their analysis the annoying and uncooperative (with their forecast) motivations and aspirations of the humans involved. But this report does consider sociological factors, in addition to organizational, economic, and technological trends. And it’s worth quoting at length what it says on the subject:

“The major sociological factor contributing to change in the business office is ‘women’s liberation.’ Working women are demanding and receiving increased responsibility, fulfillment, and opportunities for advancement. The secretarial position as it exists today is under fire because it usually lacks responsibility and advancement potential. The normal (and intellectually unchallenging) requirements of taking dictation, typing, filing, photocopying, and telephone handling leave little time for the secretary to take on new and more demanding tasks. The responsibility level of many secretaries remains fixed throughout their working careers. These factors can negatively affect the secretary’s motivation and hence productivity. In the automated office of the future, repetitious and dull work is expected to be handled by personnel with minimal education and training. Secretaries will, in effect, become administrative specialists, relieving the manager they support of a considerable volume of work.”

Despite his awareness of the women’s liberation movement of his day, the author could not see beyond the creation of a two-tier system in which some women would continue to perform dull and unchallenging tasks, while other women would be “liberated” into the fulfilling new job category of “administrative support specialist.” In this 1976 forecast, there are no women managers.

But this is not the only sociological factor the report missed. The most interesting sociological revolution of the office in the 1980s – and one missing from most (all?) accounts of the PC revolution – was what managers (male and female) did with their new word processing, communicating, calculating machine. They took over some of the “dull” secretarial tasks that no self-respecting manager would deign to perform before the 1980s.

This was the real revolution: the typing of memos (later emails), the filing of documents, the recording, tabulating, and calculating. In short, a large part of the management of office information, previously exclusively in the hands of secretaries, became in the 1980s (and progressively more so in the 1990s and beyond) an integral part of managerial work.

This was very difficult, maybe impossible, to predict. It was a question of status. No manager would type before the 1980s because it was perceived as work that was not commensurate with his status. Many managers started to type in the 1980s because now they could do it with a new “cool” tool, the PC, which conferred on them the leading-edge, high-status image of this new technology. What mattered was that you were important enough to have one of these cool things, not that you performed with it tasks that were considered beneath you just a few years before.

What was easier to predict was the advent of the PC itself. And the SRI report missed this one, too, even though it was aware of the technological trajectory: “Computer technology that in 1955 cost $1 million, was only marginally reliable, and filled a room, is now available for under $25,000 and the size of a desk. By 1985, the same computer capability will cost less than $1000 and fit into a briefcase.”
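
As a back-of-the-envelope check on the report’s own figures (the arithmetic below is mine, not SRI’s), the first two data points imply a steady exponential decline, while the 1985 target required a sharp acceleration:

\[
\left(\frac{\$25{,}000}{\$1{,}000{,}000}\right)^{1/21} \approx 0.84, \quad \text{i.e., a cost decline of roughly 16\% per year from 1955 to 1976;}
\]
\[
\left(\frac{\$1{,}000}{\$25{,}000}\right)^{1/9} \approx 0.70, \quad \text{i.e., roughly 30\% per year needed to reach the 1985 target.}
\]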

But the author of the report (just like Gordon Moore in 1965) could only see a continuation of the centralized computing of his day. The report’s 1985 fictional manager views documents on his “video display terminal” and the centralized (and specialized) word processing system of 1976 continues to rule the office ten years later.

This was a failure to predict how the computer that would “fit into a briefcase” would become personal, i.e., would take the place of the “video display terminal” and then go beyond it as a personal information management tool. And the report also failed to predict the ensuing organizational development in which distributed computing replaced or supplemented centralized computing.

Yes, predicting is hard to do. But compare forecasters and “analysts” with another human subspecies: entrepreneurs. Entrepreneurs don’t predict the future; they make it happen.

A year before the SRI report was published, in January 1975, Popular Electronics published a cover story on the first do-it-yourself PC, or what it called the first “minicomputer kit”: the Altair 8800. Paul Allen and Bill Gates, and Steve Jobs and Steve Wozniak, founded their companies around the time the SRI report was published, not because they read reports about the office of the future. They simply imagined it.

Update: Gordon Moore, quoted in VentureBeat: “Once I made a successful prediction, I avoided making another,” and “I wish I had seen the applications earlier. To me the development of the Internet was a surprise. I didn’t realize it would open up a new world of opportunities.”

Originally published on Forbes.com