Managing Hadoop projects: What you need to know to succeed
A comprehensive collection of articles, videos and more, hand-picked by our editors
One of the speakers at the 2014 TDWI Executive Summit in Boston was Ryan Fenner, a vice president and enterprise data solutions architect at MUFG Union Bank in San Francisco. A former shadow IT worker, Fenner is now focused on finding a sustainable balance between centralized and distributed management of analytics data as the technical leader on a data warehousing project at the bank. In his presentation, he detailed MUFG Union Bank's efforts to address data quality, data integration and master data management issues, and spoke about the challenges the bank faced as it implemented an agile approach to data warehouse development.
At the summit, Fenner also discussed the potential value of Hadoop technology in a video interview with SearchDataManagement. He shared his thoughts on whether the Apache Hadoop distributed processing framework has become a must-have technology for organizations, and whether Hadoop clusters will replace traditional data warehouses as repositories for business intelligence and analytics data, particularly in support of big data analytics applications.
Deploying Hadoop software isn't the big challenge, Fenner said: "At the end of the day, it's really about the use case." If an organization can get a return on investment by using Hadoop, it's worth going ahead, he added -- but if a deployment can't be justified financially, there's no need to adopt Hadoop just for the sake of doing so. Fenner also sees Hadoop systems and data warehouses as complementary technologies. "It's not one or the other," he said. "It's truly about which use case goes with which platform."
Watch the two-minute video to hear more of what Fenner had to say about adopting Hadoop technology in enterprise organizations.