Competing on AI: The New ‘New Science of Winning’

Data is eating the world, one buzzword at a time.

In 2017, The Economist declared in “Data is Giving Rise to a New Economy”: “Data are to this century what oil was to the last one—a driver of growth and change.” And IDC estimated that by 2025 we will create 163 zettabytes (163 trillion gigabytes) of data, ten times more than in 2016.

Also in 2017, the Harvard Business Review Press published an updated and expanded 10th anniversary edition of Competing on Analytics: The New Science of Winning by Tom Davenport and Jeanne Harris. More than 150,000 copies of the book have been sold and it has been translated into more than a dozen languages. Launching a data appreciation movement, the book has served as a catalyst for the establishment of numerous analytics departments in large enterprises and of many new “business analytics” undergraduate and graduate training programs.

It is not often that the originators of a new business or technology buzzword get to review the evolution of their creation ten years later. Typically, the latest new new thing is promoted by technology vendors, industry analysts, and consultants, all eager to differentiate themselves from the competition and to establish (thought) leadership in a new market segment, product category, or world-changing technology. The most important function buzzwords serve is to provide a new rationale and a new incentive for potential customers to buy new products and services. Failing to buy, the vendors warn, will ensure that those customers are “disrupted” by their competitors.

Buzzwords, however, have been only a superficial veneer of seemingly “revolutionary” (did I mention “disruptive”?) change on top of a steady evolution of computer technology since the late 1940s, an evolution driven by the increasingly sophisticated and varied use of the key product of computers, i.e., digital data. It is easier for sellers and buyers of technology-based products and services to promulgate and consume “the new new thing,” especially when it’s encapsulated in a nifty buzzword, than to engage in a drawn-out discussion of what the new stage in the evolution of data and its uses really represents.

The new edition of Competing on Analytics provides a useful overview of the latest stages in the evolution of data, or what the authors call the “3 massive changes in how analytics is practiced since 2007.” When the first edition of the book was published ten years ago, it highlighted the successful companies of the “Analytics 1.0” era, the ones using mostly descriptive analytics to better understand and draw lessons from their past performance. Data was mostly used to support (or not) business decisions.

But in 2007, a number of new companies, all Internet-related businesses, were already defining the “Analytics 2.0” era, analyzing data created online, unstructured as opposed to structured data, external as opposed to internal data, helping them anticipate where their business was headed. “These companies competed on analytics perhaps more than any of the others we wrote about in the first version of this book,” write Davenport and Harris. (By adopting the term “analytics,” which Google Analytics had popularized at the time in a different context, they were smartly rebranding what had previously been called “business intelligence” or “data mining.”)

The business of these new companies reflected a new appreciation for data not as a by-product of computer technology but as the product itself, as what their business was all about, to the point of expecting their customers to pay for their services with data rather than dollars. What they did with the data—developing new tools and techniques for storing, processing, and analyzing huge volumes of it—represented a new stage in the evolution of applying computers to statistical analysis, a process that started with the very first digital computers (used, for example, for simulation).

The new appreciation for data-as-the-business led to the creation of a new breed of data analysis experts—”data scientists”—with both software engineering and statistical analysis skills. As data was the product, they became the new product managers, and as the data was at their fingertips, they excelled at experimentation, simulating the potential risks and rewards of multiple business scenarios. The role became the “sexiest job of the 21st century” (as Davenport and D.J. Patil wrote in the Harvard Business Review), driving the rapid proliferation of “data science” training programs and research centers.
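To make that kind of experimentation concrete, here is a minimal sketch (my own illustration, not an example from the book) of a Monte Carlo simulation of a single hypothetical business scenario, written in Python with NumPy; every number in it is invented for the purpose of the example.

```python
import numpy as np

# A hypothetical scenario, with made-up numbers: a new feature costs $2M to
# build and lifts revenue by an uncertain amount, modeled here as a normal
# distribution. The simulation samples that uncertainty many times to
# estimate both the expected reward and the risk of a loss.
rng = np.random.default_rng(42)
trials = 100_000

cost = 2.0                                                   # $M, assumed fixed
revenue_lift = rng.normal(loc=3.0, scale=2.0, size=trials)   # $M, uncertain
profit = revenue_lift - cost

print(f"expected profit: ${profit.mean():.2f}M")
print(f"probability of losing money: {(profit < 0).mean():.1%}")
```

Running many such cheap simulated experiments, rather than betting on a single point estimate, is the kind of data-driven scenario analysis the new role made routine.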

Around 2011, the data appreciation movement reached all businesses (and non-profits and government agencies) in the form of a new buzzword, “Big Data.” Calling this stage “Analytics 3.0,” Davenport and Harris describe it as data and analytics becoming “mainstream business resources” and the use of data for the creation of “new products and services.” This latter aspect of the new—mainstream—appreciation of data, of data as a business, came to be known by another new buzzword, “digital transformation.”

“Big Data” was quickly eclipsed by this and other buzzwords—“Internet of Things,” for example—all marking new aspects, new uses, new applications, of the 70-year-old digital enterprise of generating and accumulating new streams of data and, most important, trying to “monetize” it (yes, another buzzword), i.e., to profit from it.

We have now entered the “Analytics 4.0” era, the “rise of autonomous analytics,” write Davenport and Harris. To my mind, it’s the best example so far of how the appreciation of data evolves: while “what has been will be again,” it sometimes returns with a slight (never “revolutionary”) improvement. The buzzword today is “Artificial Intelligence” (or “cognitive computing,” as IBM, the inventor of “data processing” in the 1950s, calls it).

The new new thing (such as getting computers to excel in object identification) has very little to do with what the pioneers of AI meant when they started using the term in the mid-1950s, and everything to do with data science (combining statistical analysis and computer engineering) and big data (specifically, with using “crowdsourcing”—yet another buzzword—to label millions of online photos, which are then used to “train” computers in object identification). A more accurate label would be “advanced machine learning,” but that does not meet the required “sexiness” quotient of a successful buzzword.
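To show what that “training” amounts to, here is a minimal sketch (my own illustration, not the authors’), in Python with scikit-learn and synthetic stand-in data; in a real object-identification system the feature vectors would be image pixels and the labels would come from millions of crowdsourced annotations.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Stand-ins for labeled photos: 1,000 synthetic feature vectors, each tagged
# 0 or 1 (say, "cat" or "dog") by hypothetical crowdsourced annotators.
X = rng.normal(size=(1000, 64))
y = (X[:, :8].sum(axis=1) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# "Training" is fitting a statistical model to the labeled examples...
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# ...and the apparent "intelligence" is that model generalizing to examples
# it has never seen.
print(f"accuracy on unseen examples: {model.score(X_test, y_test):.2f}")
```

Swap the synthetic arrays for pixel data and the simple model for a much larger one, and you have, in outline, the statistics-plus-engineering recipe described above.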

It doesn’t matter what label or buzzword we use, as long as we understand what’s really behind it; that understanding helps reduce hype and obfuscation and improves the chances of success when deploying the new new thing in a business context.

That’s the role books like Competing on Analytics play, guiding business executives through the challenges of understanding and adopting new tools and technologies. In general, they steer their readers toward putting less emphasis on the new technology and more on the people using it and on how it can be integrated smoothly with existing work processes. Davenport and Harris write: “The star companies of Competing on Analytics didn’t always use the latest tools, but they were very good at building their strategies and business models around their analytics capabilities. They made data and analytics an integral component of their cultures.”

What has not changed in the last 10 years, according to Davenport and Harris, are the challenges of developing the right organizational culture, the role of leadership, and focusing on pressing business problems. All these are “still the hardest today,” they write.

My conclusion? Competing on AI is no different from Competing on Analytics. Technology steadily evolves and advances in computer technology have driven a steady evolution in data appreciation. Human nature does not evolve and people must always be taken into account when embracing the latest stage in the evolution of technology.

Originally published on Forbes.com