Listening to Gartner analysts Sheila Childs and Merv Adrian talk recently about big data infrastructure challenges, I was reminded of a story Mike Ruettgers, former EMC Chairman and CEO, liked to tell about similar challenges in the early 1990s.
At the time, the reigning buzzword was “client/server computing,” signaling a shift to relatively inexpensive servers based on the UNIX operating system. The early adopters were not the people in the glass houses, the data center managers. Rather, they were marketing managers and other business executives, eager to circumvent the ever-so-slow IT department and take complete control of new applications, especially the new data mining tools that allowed them to gain fresh insights from the big data explosion of the late 1980s and early 1990s. After all, they had already experienced a few years of the freedom and absolute control provided by the personal computer.
Visiting the CIO of John Deere, Ruettgers asked him whether he saw these “distributed systems” coming back to be managed by the IT department. As Ruettgers told Dan Morrow of the Computerworld Honors Program, “I got a reaction that I always remember. He got red and the veins started to pop out on his face and he said, ‘I told those bastards not to do it to start with. Now they want me to take it back. I’m only going to take it back if they crawl on their hands and knees over 100 yards of broken glass.’”
Is this the future of the big data projects popping up all around the corporate world? While the Gartner analysts thought IT should own big data because “there are a lot of technical issues,” they also said that “in many cases, these projects are going outside of IT.”
I’m sure that this is true for many of the big data pilots going on in large enterprises today, just as it was with data mining and data warehousing projects in the early 1990s. What’s more, today’s cloud-based infrastructures provide an easy-to-use and cost-effective sandbox for business executives’ experiments. IT is not involved and, in most cases, not prepared.
Here’s how listeners to the Gartner webcast voted on the following poll:
How ready is your IT organization to support big data?
15%: Our IT staff is trained and supporting big data use cases, providing infrastructure, data protection, security
0%: We know what we need to do and our staff is coming up to speed
54%: We don’t yet know what skills we will need, but we’re researching this
31%: We have no plans; we will deal with it when we need to
Yes, the listeners to the webcast were certainly not a representative sample (not that any survey nowadays is based on one), but their honest answers may give us a good indicator of the state of IT readiness for big data.
Ready or not, the IT department can expect the business units to come back begging for help, just as they did in the 1990s. And even before business users start crawling back, I can imagine the red faces of all those IT executives who worked so hard over the last five years or so to establish a “data governance” model for their companies and put together master data management policies. The Gartner analysts warned their listeners that big data “could break existing governance models.” This is similar to what Forrester analyst Boris Evelson wrote last month: “You may find that all of your best DW, BI, MDM practices for SDLC, PMO and Governance aren’t directly applicable to or just don’t work for Big Data. This is where the real challenge of Big Data currently lies. I personally have not seen a good example of best practices around managing and governing Big Data. If you have one, I’d love to see it!”
Is it déjà vu all over again?