
In Pressed Data, my Forbes.com column, I try to chronicle the evolution of digital technologies, their business impact, and the people behind the innovations, business models, and new ideas. In 2016, I covered artificial intelligence (the 60-year-old "new new thing"), big data (the most recent hottest trend and a catalyst for AI's new-found popularity), the fading away of former tech leaders, a number of startups, and a number of influential business and tech innovators. These were the highlights:

When Artificial Intelligence Started To ‘Change The World’

A review of ENIAC in Action: Making and Remaking the Modern Computer, “a nuanced, engaging and thoroughly researched account of the early days of computers, the people who built and operated them, and their old and new applications,” contrasting it with “history as hype, offering a distorted view of the past, sometimes through the tinted lenses of contemporary fads and preoccupations.”

A New Documentary Reveals A One-Dimensional Face Of Big Data

Hype is on full display in "The Human Face of Big Data," of which I wrote: "…in our technology-obsessed world, new technologies and new technology applications tend sometimes to become buzzwords that are hyped, celebrated and often discussed irresponsibly by technology vendors and the media. Unfortunately, 'The Human Face of Big Data' by and large falls into this trap, the fascination (self-delusion?) with the idea that we are living in a momentous time in history thanks to technology."

Top 10 Hot Big Data Technologies

The hype, and an ambiguous, ill-defined term, does not mean that there is no value in adopting and applying the set of technologies that can be classified as big data technologies. In TechRadar: Big Data, Q1 2016, Forrester Research evaluated the maturity and trajectory of 22 technologies across the entire data life cycle.

Why Yahoo Lost And Google Won

Many of the Yahoo obituaries published in 2016 contrasted its demise with the flourishing of Google, another Web pioneer. Why was Google's attempt to "organize all the world's information" vastly more successful than Yahoo's? The short answer: Because Google did not organize the world's information. Google got the true spirit of the Web, as it was invented by Tim Berners-Lee. Following the latter's disdain for pre-defined classification systems and taxonomies, Google's founders built their information retrieval business on closely tracking cross-references (i.e., links between pages) as they were happening and correlating relevance with the quantity of cross-references (i.e., the popularity of pages as judged by how many other pages linked to them). In contrast, Yahoo had a "Chief Ontologist" on staff. As often happens in the cutting-edge technology business, new ideas are "revolutionary" only in the sense of revolving back to old ones: The concept of cross-references can be traced back to Ephraim Chambers' Cyclopaedia, published in London in 1728.
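To make the cross-reference idea concrete, here is a minimal, purely illustrative sketch in Python: a hypothetical link graph in which a page's relevance is simply the number of other pages linking to it. This is not Google's actual ranking algorithm, only a toy rendering of the "popularity by cross-reference" intuition described above.

```python
# Toy sketch (not Google's actual algorithm): rank pages by how many
# other pages link to them, i.e., relevance as cross-reference popularity.
from collections import Counter

# Hypothetical link graph: page -> pages it links to
links = {
    "a.html": ["b.html", "c.html"],
    "b.html": ["c.html"],
    "c.html": ["a.html"],
    "d.html": ["c.html", "b.html"],
}

# Count inbound links for each page
inbound = Counter(target for targets in links.values() for target in targets)

# Pages ranked by popularity (number of pages linking to them)
ranking = sorted(inbound.items(), key=lambda kv: kv[1], reverse=True)
print(ranking)  # [('c.html', 3), ('b.html', 2), ('a.html', 1)]
```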

The 3 Mindset Shifts You Need For A Successful Digital Transformation

Keri Gohman, Executive Vice President and Head of Small Business Banking at Capital One, on what businesses need to do to gain customer loyalty in the new digital environment.

AI And Machine Learning Take Center Stage At Intel Analytics Summit

From the most recent edition of the tech bible: Moore’s Law begat faster processing and cheap storage which begat machine learning and big data which begat deep learning and today’s AI Spring.

Future Business Leaders As Data Scientists: Reflections On The Career Of Intel Chief Data Scientist

A profile of Bob Rogers, Chief Data Scientist for Big Data Solutions at Intel, the entrepreneurial data scientist who has successfully applied artificial intelligence to healthcare.

A Very Short History Of Artificial Intelligence (AI)

Milestones in the evolution of “thinking machines.”

Artificial Intelligence Pioneers: Peter Norvig, Google

A profile of Peter Norvig, Director of Research at Google.

Deep Learning Is Still A No-Show In Gartner 2016 Hype Cycle For Emerging Technologies

In 2016, Gartner moved machine learning back a few notches from where it placed it on the previous year's "Hype Cycle," putting it at the peak of inflated expectations and still estimating 2 to 5 years until mainstream adoption. Is machine learning an "emerging technology," and is there a better term to describe what most of the hype is about nowadays in tech circles?

Internet Of Things By The Numbers: What New Surveys Found

Things are looking up for the Internet of Things. 80% of organizations have a more positive view of IoT today compared to a year ago, according to a survey of 512 IT and business executives by CompTIA.

A Very Short History Of EMC Corporation

Milestones in the remarkable 37-year history of EMC Corporation, 16 years of which I had the pleasure of witnessing in person.

Only Humans Need Apply Is A Must-Read On AI For Facebook Executives

In Only Humans Need Apply: Winners and Losers in the Age of Smart Machines, knowledge work and analytics expert Tom Davenport and Julia Kirby, a contributing editor for the Harvard Business Review, re-introduce the concept of augmentation to our discussion of the impact of AI on jobs: humans and computers combining "their strengths to achieve more favorable outcomes than either could do alone."

12 Observations About Artificial Intelligence From The O’Reilly AI Conference

At the inaugural O'Reilly AI conference, 66 artificial intelligence practitioners and researchers from 39 organizations presented the current state of AI: from chatbots and deep learning to self-driving cars and emotion recognition, from automating jobs and obstacles to AI progress to saving lives and new business opportunities.

Artificial Intelligence (AI) And The Future Of Marketing: 6 Observations From Inbound 2016

At Inbound 2016, HubSpot’s co-founders Brian Halligan and Dharmesh Shah entertained 19,000 attendees with their take on the past and future of marketing.

2017 Predictions For AI, Big Data, IoT, Cybersecurity, And Jobs From Senior Tech Executives

Artificial intelligence (and machine/deep learning) is the hottest trend, eclipsing, but building on, the accumulated hype for the previous “new big thing,” big data. The new catalyst for the data explosion is the Internet of Things, bringing with it new cybersecurity vulnerabilities. The rapid fluctuations in the relative temperature of these trends also create new dislocations and opportunities in the tech job market.

Here’s to a productive and enjoyable 2017!