Two new reports on big data and big decisions were released last week by Accenture and PwC. Both reports shed new light on the impact of big data on enterprises today, and how it is changing the process of decision making by senior executives.
An Accenture worldwide survey of senior technology and business executives has found that companies that have completed at least one big data project are happy with the results. 60% of executives said their companies had successfully completed a big data implementation, 36% had not yet pursued a big data project, and 4% were pursuing their first big data project but had not finished it. 92% of executives from companies that have completed a big data implementation are satisfied with the results, and 89% rated big data as “very important” or “extremely important” to their businesses’ digital transformation.
Other highlights include:
In welcome news to the graduates of the numerous data science and business analytics programs, nearly all (91%) companies expect to increase their data science expertise, the majority within the next year. 54% of executives said their companies have already developed internal technical training opportunities for their employees and most organizations also tap outside expertise. Only 5% of respondents said their company used only internal resources for their big data implementations.
The report also reveals that “many companies have different definitions of big data.” Indeed, in answer to the question “Which of the following do you consider part of big data?” responses varied from “Large data files” (65%) to “Advanced analytics or analysis” (60%) to “Data from visualization tools” (50%).
Regardless of how they define big data, 89% believe big data will revolutionize business operations in the same way the Internet did. 85% feel big data will dramatically change the way they do business and 79% agree that “companies that do not embrace big data will lose their competitive position and may even face extinction.” More than anything, in my opinion, these answers reflect the persuasive powers of a buzzword, inherent in its dual, attention-getting role: The threat of “disruption” and the promise of a “revolution” (two words used to great effect by the report).
The PwC report, titled “Gut & Gigabytes: Capitalising on the art & science in decision making,” is based on an Economist Intelligence Unit (EIU) worldwide survey of 1,135 senior executives. It clearly defines big data as “the recent wave of electronic information produced in greater volume by a growing number of sources (i.e. not just data collected by a particular organisation in the course of normal business).”
Intuition has gotten a bad rap in the age of big data, and the most interesting finding of this report is that 30% of executives admit that intuition is what they most relied on when they made their last big decision. An additional 28% relied on other people’s intuition (“advice or experience of others internally”). Only 30% said that “data and analysis (internal or external)” is what they relied on for their last big decision, and another 9% relied on “financial indicators.”
The reliance on intuition and experience is based on… experience. 46% of executives said that relying on data analysis has been detrimental to their business in the past. They are concerned about quality, accuracy and completeness of data and find it difficult to access useful data.
Still, 64% of the executives surveyed said that big data has changed decision-making in their organizations and 25% expect it will do so over the next two years. And 49% of executives agree that data analysis is undermining the credibility of intuition or experience, compared with 21% who disagree. Says the report: “In reality, however, experience and intuition, and data and analysis, are not mutually exclusive. The challenge for business is how best to marry the two. A ‘gut instinct’ nowadays is likely to be based on increasingly large amounts of data, while even the largest data set cannot be relied upon to make an effective big decision without human involvement.”
[Originally published on Forbes.com]
Last week I got an email from UC Berkeley’s Master of Information and Data Science program, inviting me to respond to a survey of data science thought leaders on the question “What is big data?” I was especially delighted to be regarded as a “thought leader” by Berkeley’s School of Information, whose previous dean, Hal Varian (now chief economist at Google), answered my challenge fourteen years ago and produced the first study to estimate the amount of new information created in the world annually, a study I consider a major milestone in the evolution of our understanding of big data.
The Berkeley researchers estimated that the world had produced about 1.5 billion gigabytes of information in 1999, and a 2003 replication of the study found that amount had doubled in just three years. Data was already getting bigger and bigger, and around that time, in 2001, industry analyst Doug Laney described the “3Vs”—volume, variety, and velocity—as the key “data management challenges” for enterprises, the same “3Vs” that have been used over the last four years by just about anyone attempting to define or describe big data.
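For a sense of what that doubling implies, here is a quick back-of-the-envelope sketch; the figures are the Berkeley estimates cited above, and the arithmetic is mine:

```python
# Back-of-the-envelope arithmetic on the Berkeley figures cited above:
# ~1.5 billion gigabytes of new information in 1999, doubling by 2002.
base_gb = 1.5e9        # new information produced in 1999, in gigabytes
years_to_double = 3    # the 2003 replication found a doubling in 3 years

# If output doubles every 3 years, the annual growth rate g
# satisfies (1 + g)^3 = 2, so g = 2^(1/3) - 1.
annual_growth = 2 ** (1 / years_to_double) - 1
print(f"Implied annual growth: {annual_growth:.1%}")  # roughly 26% per year
print(f"Implied 2002 output: {base_gb * 2:.1e} GB")
```

At that rate, output multiplies roughly tenfold every decade, which is the “bigger and bigger” trajectory Laney’s 3Vs were meant to manage.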
The first documented use of the term “big data” appeared in a 1997 paper by scientists at NASA, describing the problem they had with visualization (i.e., computer graphics), which “provides an interesting challenge for computer systems: data sets are generally quite large, taxing the capacities of main memory, local disk, and even remote disk. We call this the problem of big data. When data sets do not fit in main memory (in core), or when they do not fit even on local disk, the most common solution is to acquire more resources.”
In 2008, a number of prominent American computer scientists popularized the term, predicting that “big-data computing” will “transform the activities of companies, scientific researchers, medical practitioners, and our nation’s defense and intelligence operations.” The term “big-data computing,” however, is never defined in the paper.
The traditional database of authoritative definitions is, of course, the Oxford English Dictionary (OED). Here’s how the OED defines big data: (definition #1) “data of a very large size, typically to the extent that its manipulation and management present significant logistical challenges.”
But this is 2014, and maybe the first place to look for definitions should be Wikipedia. Indeed, it looks like the OED followed Wikipedia’s lead. Wikipedia defines big data (and did so before the OED) as (#2) “an all-encompassing term for any collection of data sets so large and complex that it becomes difficult to process using on-hand data management tools or traditional data processing applications.”
While a variation of this definition is what is used by most commentators on big data, its similarity to the 1997 definition by the NASA researchers reveals its weakness. “Large” and “traditional” are relative and ambiguous (and potentially self-serving for IT vendors selling either “more resources” of the “traditional” variety or new, non-“traditional” technologies).
The widely-quoted 2011 big data study by McKinsey highlighted that definitional challenge. Defining big data as (#3) “datasets whose size is beyond the ability of typical database software tools to capture, store, manage, and analyze,” the McKinsey researchers acknowledged that “this definition is intentionally subjective and incorporates a moving definition of how big a dataset needs to be in order to be considered big data.” As a result, all the quantitative insights of the study, including the updating of the UC Berkeley numbers by estimating how much new data is stored by enterprises and consumers annually, relate to digital data in general rather than to big data specifically; no attempt was made, for example, to estimate how much of the data (or “datasets”) enterprises store is big data.
Another prominent source on big data is Viktor Mayer-Schönberger and Kenneth Cukier’s book on the subject. Noting that “there is no rigorous definition of big data,” they offer one that points to what can be done with the data and why its size matters:
(#4) “The ability of society to harness information in novel ways to produce useful insights or goods and services of significant value” and “…things one can do at a large scale that cannot be done at a smaller one, to extract new insights or create new forms of value.”
In Big Data@Work, Tom Davenport concludes that because of “the problems with the definition” of big data, “I (and other experts I have consulted) predict a relatively short life span for this unfortunate term.” Still, Davenport offers this definition:
(#5) “The broad range of new and massive data types that have appeared over the last decade or so.”
Let me offer a few other possible definitions:
(#6) The new tools helping us find relevant data and analyze its implications.
(#7) The convergence of enterprise and consumer IT.
(#8) The shift (for enterprises) from processing internal data to mining external data.
(#9) The shift (for individuals) from consuming data to creating data.
(#10) The merger of Madame Olympe Maxime and Lieutenant Commander Data.
(#11) The belief that the more data you have, the more insights and answers will rise automatically from the pool of ones and zeros.
(#12) A new attitude by businesses, non-profits, government agencies, and individuals that combining data from multiple sources could lead to better decisions.
I like the last two. #11 is a warning against blindly collecting more data for the sake of collecting more data (see NSA). #12 is an acknowledgment that storing data in “data silos” has been the key obstacle to getting the data to work for us, to improve our work and lives. It’s all about attitude, not technologies or quantities.
What’s your definition of big data?
See here for a compilation of big data definitions from 40+ thought leaders.
[Originally published on Forbes.com]
Now that it has been established that the Internet of Things is the most hyped “emerging technology” today, and that the term—and the associated technologies—is far from new, the only question left to answer is: Why the sudden surge of interest in 2014?
That’s the question I put to a number of tech luminaries earlier this year. Bob Metcalfe, inventor of Ethernet and now Professor of Innovation at the University of Texas at Austin, is familiar with technologies that attain sudden prominence after lengthy incubation periods. Metcalfe points to scribblers like me as the main culprit: “It’s a media phenomenon. Technologies and standards and products and markets emerge slowly, but then suddenly, chaotically, the media latches on and BOOM!—It’s the year of IoT.” Hal Varian, Chief Economist at Google, believes Moore’s Law has something to do with the newfound interest in the IoT: “The price of sensors, processors, and networking has come way down. Since WiFi is now widely deployed, it is relatively easy to add new networked devices to the home and office.”
Janus Bryzek, known as “the father of sensors” (and a VP at Fairchild Semiconductor), thinks there are multiple factors “accelerating the surge” in interest. First, there is the new version of the Internet Protocol, IPv6, “enabling almost unlimited number of devices connected to networks.” Another factor is that four major network providers—Cisco, IBM, GE and Amazon—have decided “to support IoT with network modification, adding Fog layer and planning to add Swarm layer, facilitating dramatic simplification and cost reduction for network connectivity.” Last but not least, Bryzek mentions new forecasts regarding the IoT opportunity, with GE estimating that the “Industrial Internet” has the potential to add $10 to $15 trillion (with a “T”) to global GDP over the next 20 years, and Cisco increasing to $19 trillion its forecast for the economic value created by the “Internet of Everything” in the year 2020. “This is the largest growth in the history of humans,” says Bryzek.
These mind-blowing estimates, coming from companies that develop and sell IoT-related products and services, have no doubt helped fuel the media frenzy. But what do the professional prognosticators say? Gartner estimates that IoT product and service suppliers will generate incremental revenue exceeding $300 billion in 2020. IDC forecasts that the worldwide market for IoT solutions will grow from $1.9 trillion in 2013 to $7.1 trillion in 2020.
Other research firms focus on slices of this potentially trillion-dollar market such as connected cars, smart homes, and wearables. Here’s a roundup of estimates and forecasts for various segments of the IoT market:
ABI Research: The installed base of active wireless connected devices will exceed 16 billion in 2014, about 20% more than in 2013. The number of devices will more than double from the current level, with 40.9 billion forecasted for 2020. 75% of the growth between today and the end of the decade will come from non-hub devices: sensor nodes and accessories. The chart above is from ABI’s research on smart cars.
Acquity Group (Accenture Interactive): More than two thirds of consumers plan to buy connected technology for their homes by 2019, and nearly half say the same for wearable technology. Smart thermostats are expected to have 43% adoption in the next five years (see chart below).
IHS Automotive: The number of cars connected to the Internet worldwide will grow more than sixfold to 152 million in 2020 from 23 million in 2013.
Navigant Research: The worldwide installed base of smart meters will grow from 313 million in 2013 to nearly 1.1 billion in 2022.
Morgan Stanley: Driverless cars will generate $1.3 trillion in annual savings in the United States, with over $5.6 trillion in savings worldwide.
Machina Research: Consumer Electronics M2M connections will top 7 billion in 2023, generating $700 billion in annual revenue.
On World: By 2020, there will be over 100 million Internet-connected wireless light bulbs and lamps worldwide, up from 2.4 million in 2013.
Juniper Research: The wearables market will exceed $1.5 billion in 2014, double its 2013 value.
Endeavour Partners: As of September 2013, one in ten U.S. consumers over the age of 18 owns a modern activity tracker. More than half of U.S. consumers who have owned a modern activity tracker no longer use it. A third of U.S. consumers who have owned one stopped using the device within six months of receiving it.
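The start-and-end figures in the roundup above span different time periods, so they are easier to compare once converted into implied compound annual growth rates. A minimal sketch, using only numbers quoted in the list (the `cagr` helper and the selection of three forecasts are mine):

```python
# Convert a few of the forecasts above into implied compound annual
# growth rates (CAGR), so forecasts over different spans can be compared.
def cagr(start, end, years):
    """Annual growth rate g such that start * (1 + g)**years == end."""
    return (end / start) ** (1 / years) - 1

forecasts = [
    # (source, metric, start value, end value, span in years)
    ("IHS Automotive", "connected cars", 23e6, 152e6, 2020 - 2013),
    ("Navigant Research", "smart meters", 313e6, 1.1e9, 2022 - 2013),
    ("ABI Research", "wireless connected devices", 16e9, 40.9e9, 2020 - 2014),
]

for source, metric, start, end, years in forecasts:
    rate = cagr(start, end, years)
    print(f"{source}: {metric} imply ~{rate:.0%} per year")
# IHS's sixfold jump implies ~31%/year; Navigant's ~3.5x implies
# ~15%/year; ABI's ~2.6x implies ~17%/year.
```

The spread of implied growth rates, roughly 15% to 31% per year, is one way to gauge how aggressive each forecast is relative to the others.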
[Originally published on Forbes.com]
Bruce Leichtman, president and principal analyst for Leichtman Research Group, Inc.: “With the addition of more than 30 million broadband subscribers over the past decade, cable providers have clearly expanded well beyond their roots in cable TV service. As of the end of 2Q 2014, the top cable providers now have more broadband subscribers than cable TV subscribers.”
Peter Kafka, Re/code: “The top cable guys now have 49,915,000 Internet subscribers, compared to 49,910,000 TV subscribers. And to be sure, most cable customers are getting both services. Still, this is directionally important. The future for the pay TV guys isn’t selling you pay TV — it’s selling you access to data pipes, and pay TV will be one of the things you use those pipes for.”
Marcus Wohlsen, Wired: “What this means for the future of TV is still tough to predict. While these figures may suggest the inevitable transition to an internet-dominated future, nearly 50 million cable subscribers don’t appear ready to cut the cord just yet. Even with a plethora of on-demand options, people are still watching TV like they used to, which means a business model still based around ads and subscription fees. But that’s still a loss of millions of cable subscribers over the past half-decade, while the number of broadband subscribers has climbed at a much faster clip.
“Meanwhile, traditional TV as a format already is being engulfed by the open-endedness of the internet. From mainstream streaming services like Netflix, Hulu, and Amazon Instant Video to niche sites like Funny or Die to YouTube celebrities—to name just some of the options that fall under entertainment—the kinds of moving pictures available and the ways to consume them have never been greater. Within this broader spectrum, cable as a concept could become just another niche, one channel among many as the insatiable internet swallows everything it encounters.”