In Pressed Data, my Forbes.com column, I try to chronicle the evolution of digital technologies, their business impact, and the people behind the innovations, business models, and new ideas. In 2016, I covered artificial intelligence (the 60-year-old "new new thing"), big data (the previous hottest trend and a catalyst for AI's newfound popularity), the fading away of former tech leaders, a number of startups, and a number of influential business and tech innovators. These were the highlights:
A review of ENIAC in Action: Making and Remaking the Modern Computer, “a nuanced, engaging and thoroughly researched account of the early days of computers, the people who built and operated them, and their old and new applications,” contrasting it with “history as hype, offering a distorted view of the past, sometimes through the tinted lenses of contemporary fads and preoccupations.”
Hype is on full display in “The Human Face of Big Data” of which I wrote: “…in our technology-obsessed world, new technologies and new technology applications tend sometimes to become buzzwords that are hyped, celebrated and often discussed irresponsibly by technology vendors and the media. Unfortunately, ‘The Human Face of Big Data’ by and large falls into this trap, the fascination (self-delusion?) with the idea that we are living in a momentous time in history thanks to technology.”
The hype, and the ambiguity of an ill-defined term, do not mean that there is no value in adopting and applying the set of technologies that can be classified as big data technologies. In TechRadar: Big Data, Q1 2016, Forrester Research evaluated the maturity and trajectory of 22 technologies across the entire data life cycle.
Many of the Yahoo obituaries published in 2016 contrasted its demise with the flourishing of Google, another Web pioneer. Why was Google’s attempt to “organize all the world’s information” vastly more successful than Yahoo’s? The short answer: Because Google did not organize the world’s information. Google got the true spirit of the Web, as it was invented by Tim Berners-Lee. Following the latter’s disdain for pre-defined classification systems and taxonomies, Google’s founders built their information retrieval business on closely tracking cross-references (i.e., links between pages) as they were happening and correlating relevance with quantity of cross-references (i.e., popularity of pages as judged by how many other pages linked to them). In contrast, Yahoo had a “Chief Ontologist” on staff. As often happens in the cutting-edge technology business, new ideas are “revolutionary” only in the sense of revolving back to old ones: The concept of cross-references can be traced back to Ephraim Chambers’ Cyclopaedia, published in London in 1728.
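The link-counting idea described above can be sketched in a few lines of code. This is a minimal, illustrative PageRank-style power iteration over a toy link graph, not Google's actual implementation; the function name, the toy graph, and the parameter values are all assumptions made for the example.

```python
# A minimal sketch of link-based relevance ranking: a page's score grows
# with the number (and importance) of pages that link to it, computed by
# repeated redistribution of scores along the links (power iteration).
# This is an illustration only, not Google's production algorithm.

def link_rank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with uniform scores
    for _ in range(iterations):
        # every page keeps a small baseline score...
        new_rank = {p: (1.0 - damping) / n for p in pages}
        # ...and passes the rest of its current score to the pages it links to
        for page, outlinks in links.items():
            if outlinks:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical four-page web: most pages link to "popular".
graph = {
    "hub": ["popular", "a", "b"],
    "a": ["popular"],
    "b": ["popular"],
    "popular": ["hub"],
}
ranks = link_rank(graph)
# "popular" ends up with the highest score, as relevance here is judged
# purely by who links to whom, with no editor or ontologist in the loop.
```

The contrast with Yahoo's approach is visible in the code: nothing here classifies pages by topic; relevance emerges entirely from the link structure.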
Keri Gohman, Executive Vice President and Head of Small Business Banking at Capital One, on what businesses need to do to gain customer loyalty in the new digital environment.
From the most recent edition of the tech bible: Moore’s Law begat faster processing and cheap storage which begat machine learning and big data which begat deep learning and today’s AI Spring.
A profile of Bob Rogers, Chief Data Scientist for Big Data Solutions at Intel, the entrepreneurial data scientist who has successfully applied artificial intelligence to healthcare.
Milestones in the evolution of “thinking machines.”
A profile of Peter Norvig, Director of Research at Google.
In 2016, Gartner moved machine learning back a few notches from where it had placed it on the previous year’s “Hype Cycle,” putting it at the peak of inflated expectations and still estimating two to five years until mainstream adoption. Is machine learning an “emerging technology,” and is there a better term to describe what most of the hype is about nowadays in tech circles?
Things are looking up for the Internet of Things. 80% of organizations have a more positive view of IoT today compared to a year ago, according to a survey of 512 IT and business executives by CompTIA.
Milestones in the remarkable 37-year history of EMC Corporation, 16 of which I had the pleasure to witness in person.
In Only Humans Need Apply: Winners and Losers in the Age of Smart Machines, knowledge work and analytics expert Tom Davenport and Julia Kirby, a contributing editor for the Harvard Business Review, re-introduce the concept of augmentation to our discussion of the impact of AI on jobs—humans and computers combining “their strengths to achieve more favorable outcomes than either could do alone.”
At the inaugural O’Reilly AI conference, 66 artificial intelligence practitioners and researchers from 39 organizations presented the current state of AI: from chatbots and deep learning to self-driving cars and emotion recognition, from automating jobs and obstacles to AI progress to saving lives and new business opportunities.
At Inbound 2016, HubSpot’s co-founders Brian Halligan and Dharmesh Shah entertained 19,000 attendees with their take on the past and future of marketing.
Artificial intelligence (and machine/deep learning) is the hottest trend, eclipsing, but building on, the accumulated hype for the previous “new big thing,” big data. The new catalyst for the data explosion is the Internet of Things, bringing with it new cybersecurity vulnerabilities. The rapid fluctuations in the relative temperature of these trends also create new dislocations and opportunities in the tech job market.
Here’s to a productive and enjoyable 2017!