
New Hadoop projects aim to boost interoperability, data lake benefits

By now, many companies understand the concept of big data. Figuring out what to do with useful bits of information is more elusive.

In this episode of BizApps Today, editors Jason Sparapani and Craig Stedman explain how data lake analytics and interoperability can improve efforts to sift through big data in Apache Hadoop projects. Hadoop is an open source framework for distributed storage and processing of large data sets, and it forms the foundation of many big data applications.

Among the newest efforts in this regard is the Open Data Platform (ODP) initiative, which seeks to establish a set of standards to promote Hadoop interoperability.

The proliferation of vendor-specific add-ons to Hadoop's base technology led Pivotal and Hortonworks to launch the ODP initiative, says Stedman, executive editor of SearchDataManagement and SearchBusinessAnalytics. The initiative's goal is to certify a set of compatible products that users can mix and match, although some leading Hadoop vendors, including Cloudera and MapR, aren't on board.

"There's kind of a schism on this," Stedman tells host Joe Hebert.

Similarly, attendees at the recent Strata + Hadoop World 2015 conference expressed mixed reactions to the ODP initiative. "Some of the users we talked to at the conference were concerned that having some vendors in and others out might make it harder to switch products, not easier," Stedman says. "So I think this bears watching to see what's really going to come out of this."

Meanwhile, the February 2015 issue of Business Information continues the big data theme with a look at how companies can get value from big data analytics by using it wisely. The cover story features insurance carrier Allstate, which uses a data lake to improve its business processes. As explained in an earlier BizApps Today video, a data lake is a repository based on Hadoop that stores large volumes of raw data in its native format.

The insurance company uses its data lake "to figure out when it's OK for Allstate's inspectors to skip inspections before underwriting homeowner policies," says Sparapani, managing editor for Business Information.

The new issue also features the debut of "The Corner Office," a column by Celso Mello, the CIO at Reliance Home Comfort, a supplier of home heating and cooling systems throughout Canada.

"We cover a lot of the wants and needs of IT managers and admins, but we also want to cover the strategy side, because underlying IT is the need to really align with business goals," Sparapani says. "So we go right to the source, which is the CIO."

Mello's first column discusses how to become a CIO, offering five strategies for gaining visibility into the business.

How do you feel about the state of big data applications? Is there enough interoperability built into Hadoop projects and processes? Let us know what you think in the comments section below.

Text by Scott Wallask, news director for TechTarget's Business Applications Group. Email him at [email protected] and follow him on Twitter: @Scott_HighTech.
