
Strata big data conference sees tools edging toward chasm

At the Strata big data conference, Geoffrey Moore looked to apply Chasm Theory to Hadoop and the like, as described in our podcast.

Hadoop, Hive and MapReduce have garnered mountains of publicity of late, but attendees at the recent Strata big data conference were looking to see if there was a breakout killer app in the offing, SearchDataManagement's Jack Vaughan tells SearchBusinessAnalytics' Ed Burns in this Talking Data podcast. Big data use cases were among the topics of conjecture at Strata 2014 in Santa Clara, Calif., Vaughan said.

At Strata, data scientists and Hadoop and NoSQL developers heard keynoter Geoffrey Moore, author of Crossing the Chasm, apply his theory of technology disruption and adoption to the big data phenomenon.

A gap typically exists between early adopters and an early majority of technology users, he said, placing Hadoop and related big data tools firmly on the "enthusiast" side of the divide today.

Niche-oriented Hadoop and its whimsically named associated tools have the potential to go mainstream in 2015, Moore said, a later timeline than many other big data odds-makers had projected.

In fact, deals and alliances still seem to be the order of the day for Hadoop and dedicated Hadoop distribution providers such as Cloudera, MapR and Hortonworks, which forged pacts with Databricks, HP Vertica and Red Hat, respectively, at the Strata big data conference. The latter move bears special attention, as it pairs a well-heeled Hadoop startup with a well-established open source software leader.

Hortonworks and Red Hat formed a strategic alliance that includes integration of product lines, marketing initiatives and collaborative customer support. Also on tap is a beta program for the Hortonworks Data Platform, or HDP, plug-in for Red Hat Storage.

In reviewing the Red Hat-Hortonworks news, 451 Group analyst Matt Aslett said systems vendors were desperate to avoid a repeat of the Unix wars of yore. That doesn't seem to be happening with Hadoop, he said -- thanks in part to the fact that Apache Hadoop has a broader "core" than the Linux kernel.

Listen to this seven-minute podcast and read more about the topic in "Seeking Hadoop best practices for production." During the podcast you will:

  • Find a discussion of best practices among Hadoop implementers
  • Discover technology guru Geoffrey Moore's prognostications for big data technology
  • Hear about a Cisco data architect's "late adoption" plans for Hadoop

Jack Vaughan is SearchDataManagement's news and site editor. Email him, and follow us on Twitter: @sDataManagement.

