In many organizations, cloud computing is upending the IT status quo and the traditional approach to data architecture and management. As the use of Software-as-a-Service (SaaS) technology grows, often driven by individual business units and departments, so does the need to deftly connect SaaS data to on-premises systems.
Like it or not, enterprise data management professionals have to accommodate SaaS data integration requirements.
There are precedents, of course. The end user took center stage back in the client/server era, when PCs, spreadsheets and commodity servers overthrew the previously dominant mainframe regime. The spotlight is even brighter in today's bring-your-own-device mobile environment.
And business units have long called some of their own IT shots, from buying servers and business applications to building and deploying tactical data marts.
That's even easier now. Cloud applications and infrastructure services can be obtained with little input from IT -- sometimes with just a credit card. And ease of implementation is only one draw; the speed of cloud implementations is another, according to Noel Yuhanna, an analyst at Forrester Research in Cambridge, Mass. Outside data sources, including SaaS data created by business units and stored beyond corporate firewalls, must be made accessible more and more quickly, he told attendees at last month's TDWI BI Executive Summit in Las Vegas. As he sees it, that is one of the critical elements in future data architecture and integration planning.
Knitting SaaS data into the IT fabric
"The future of the data management platform is going to be about access across various domains in real time," Yuhanna said. What is needed, he said, is an "information fabric" that combines central IT systems with SaaS data and information from other external sources, including much of what we've come to know as big data.
Among the changes Yuhanna anticipates are more lines of business building their own data models and maintaining their own data sets, many residing in the cloud. The challenge is familiar. Data management pros need to give some rein to the departmental leaders who are close to the data and understand it well. Yet they need to fashion an enterprise data fabric generalized enough to handle widely varied data.
It's still too early to say what such architectures will look like. But the TDWI event provided a look at something along those lines, through a presentation by Bhavna Kapoor, a business intelligence architect at The Climate Corp., a San Francisco-based startup that is looking to exploit cloud architecture.
To hear Kapoor tell it, the data fabric may bear a resemblance to data virtualization architectures being promoted by Composite Software, Denodo Technologies and other vendors. Virtualization tools manage pools of data as services accessible through a dedicated software abstraction layer, and they do seem a ready fit for dealing with SaaS application data at companies like Kapoor's.
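The core idea behind those virtualization tools -- one abstraction layer fielding queries across otherwise disconnected data pools -- can be sketched in a few lines. This is a minimal, illustrative sketch, not any vendor's actual API; the class and source names are invented for the example.

```python
# Minimal sketch of the data virtualization idea: consumers query one
# abstraction layer, which federates results from disparate sources.
# Class names and source names here are illustrative, not a vendor API.

class DataSource:
    """A named source exposing its rows as dictionaries."""
    def __init__(self, name, rows):
        self.name = name
        self.rows = rows

    def query(self, predicate=lambda row: True):
        # Return only the rows matching the caller's filter.
        return [row for row in self.rows if predicate(row)]

class VirtualLayer:
    """Registers sources and answers queries across them as one pool."""
    def __init__(self):
        self.sources = {}

    def register(self, source):
        self.sources[source.name] = source

    def query(self, source_names, predicate=lambda row: True):
        # Gather matching rows from each requested source; the caller
        # never needs to know where any given row physically lives.
        results = []
        for name in source_names:
            results.extend(self.sources[name].query(predicate))
        return results

# Two "pools" of data -- say, a SaaS app and an on-premises database.
saas = DataSource("crm", [{"customer": "Acme", "region": "West"}])
onprem = DataSource("orders", [{"customer": "Acme", "total": 1200}])

layer = VirtualLayer()
layer.register(saas)
layer.register(onprem)

# One call spans both domains.
combined = layer.query(["crm", "orders"])
```

The point of the abstraction is visible in the last line: the consumer asks the layer, not the individual systems, which is what lets a virtualization platform present SaaS and on-premises data as a single fabric.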
Data-driven business plans for bad weather
The Climate Corp. looks to leverage huge piles of weather data and internally developed algorithms to predict conditions and analyze related agricultural risks, as part of its efforts to sell weather insurance policies to farming businesses. The company's secret sauce is the algorithms, which are at the heart of the monthly weather simulations it runs on the Amazon S3 cloud. Kapoor said corporate executives want to be able to move quickly on IT when needed. In addition, they want the company's engineers to specialize in creating and updating those algorithms, not to become data integration experts.
The company is a few months into an initial deployment of Denodo's data virtualization software and has connected its Salesforce.com customer-relationship-management application data to a MySQL operational database via the virtualization layer, Kapoor said. The virtualization platform then feeds the combined output to a collection of data visualization tools, some built in-house and some bought from Tableau Software and other vendors, for business analysis uses. And it's all done in the cloud: "We don't have a data center," she said. "Everything lives on the cloud or SaaS systems."
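The shape of the pipeline Kapoor describes -- CRM records joined with operational rows inside a virtualization layer, then handed downstream as one combined feed -- can be hedged into a short sketch. The field names and the join key below are invented for illustration; a real deployment would pull live records via the Salesforce and MySQL connectors rather than use in-memory lists.

```python
# Hedged sketch of the described pipeline: CRM records (as a SaaS API
# might return them) joined with operational rows (as a MySQL query
# might return them), without copying either into a warehouse first.
# All field names and values here are invented for the example.

crm_records = [  # stand-in for Salesforce.com query results
    {"account_id": "A1", "account_name": "Green Acres Farm"},
    {"account_id": "A2", "account_name": "Prairie Co-op"},
]

operational_rows = [  # stand-in for rows from the MySQL operational DB
    {"account_id": "A1", "policies_sold": 3},
    {"account_id": "A2", "policies_sold": 7},
]

def virtual_join(crm, ops, key="account_id"):
    """Merge the two sources on a shared key, the way a virtualization
    layer presents a combined view over live sources."""
    ops_by_key = {row[key]: row for row in ops}
    return [{**c, **ops_by_key.get(c[key], {})} for c in crm]

# This combined feed is what downstream visualization tools consume.
combined_feed = virtual_join(crm_records, operational_rows)
```

The design choice this illustrates is the one Kapoor emphasizes: the join happens in the virtualization layer, so the engineers building algorithms and the analysts using Tableau-style tools both work against one logical view rather than two physical systems.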
For most companies, everything will not be on the cloud going forward. But SaaS data integration is becoming so much a part of the job for data management teams that it might begin to seem that way.
Jack Vaughan is SearchDataManagement.com's news and site editor. Email him at firstname.lastname@example.org.
Follow SearchDataManagement.com on Twitter: @sDataManagement.