Internal big data skills trump outside consulting help

Big data skills are in short supply in many companies. But some IT managers say it’s more fruitful to develop them in-house than to rely on outside help.

Big data projects rely on advanced skill sets that are in short supply in many organizations. That might be a big opening for consultants offering outside help. For example, research organization Wikibon estimates that worldwide professional services revenue from big data deployments will vault from $3.87 billion last year to $15.38 billion in 2017.

But ensuring that your organization has ample big data talent is important to the success of big data applications, said Neeraj Kumar, vice president of enterprise architecture at Cardinal Health, a distributor of pharmaceuticals and medical products in Dublin, Ohio, that's in the early stages of working with Hadoop. "Innovation does not happen via consultants," he said. "It happens when you have skin in the game, when you have people working with [the data] who understand the business and understand the problems."

Kumar sees bringing in consultants to help train internal staffers on big data management and analytics technologies as a valid way to get started on projects. After that, though, "you need people in-house who can take the project forward," he said.

That's the approach taken by Edmunds.com Inc. Paddy Hannon, its vice president of architecture, said the online publisher of automobile pricing data and other car-shopping information hasn't used outside consultants on a Hadoop deployment that began in 2011. Instead, a team of Java developers learned how to use a slew of new open source technologies to drive the Santa Monica, Calif., company's efforts to tap Hadoop and related big data tools as an alternative to an existing relational data warehouse that could no longer scale to meet its needs.

In a June blog post, Edmunds Chief Information Officer Philip Potloff wrote that the early months of the initiative were slow going, as team members mastered the intricacies of the Hadoop Distributed File System, MapReduce, HBase, Oozie, Hive and Pig -- all tools in the Hadoop ecosystem. But he said the company made the switch to the new Hadoop-based data warehouse system last February and now is supporting all of its reporting operations with information from the new setup.
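For readers unfamiliar with the programming model behind several of those tools, the core MapReduce idea the Edmunds team had to master can be sketched in a few lines. The following is an illustrative, single-process Python sketch, not Edmunds' code; in a real Hadoop job these phases run distributed across a cluster, with input and output stored in the Hadoop Distributed File System, and the function and sample data names here are invented for the example.

```python
from collections import defaultdict

def map_phase(records):
    """Map: emit (word, 1) pairs from each input record."""
    for record in records:
        for word in record.split():
            yield (word, 1)

def shuffle_phase(pairs):
    """Shuffle: group the emitted values by key."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: aggregate each key's values -- here, summing counts."""
    return {key: sum(values) for key, values in groups.items()}

# Word count, the canonical MapReduce example, on toy data.
records = ["hadoop hive pig", "hive hadoop"]
counts = reduce_phase(shuffle_phase(map_phase(records)))
```

Higher-level tools such as Hive and Pig exist precisely so that developers can express this kind of aggregation as a query or script instead of hand-writing the map and reduce functions.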

This was first published in August 2013
