Financial services companies had better focus on becoming faster and more agile with their data management and analytics if they want to avoid a repeat of the financial crisis of 2008, according to one Oracle Corp. executive.
Financial services firms that embrace near real-time analytics and other data management technologies designed for agility will have a better chance of avoiding counterparty risk and, perhaps more importantly, systemic risk -- a major problem at the heart of the 2008 crisis -- according to Amir Halfon, Oracle's senior director of technology for financial services.
"A lot of it has to do with being able to aggregate data much faster across different asset classes, different lines of business and different structures of the data so that you can get a sense of what is the counterparty exposure and where is the risk," explained Halfon, who spoke earlier this month at the Enterprise Data World conference in Atlanta.
Current regulatory efforts -- such as the joint effort between private financial firms and the U.S. government to agree upon and enforce provisions of the Dodd-Frank Act -- are aimed at giving regulators and private businesses a more accurate view of risk exposures across asset classes and lines of business so they can better predict risk and systemic problems that may arise.
To better predict systemic risk, organizations need to effectively manage both structured and unstructured information, such as reference data, information found in over-the-counter contracts and positions data, according to Halfon.
Financial services companies have always crunched data in an effort to avoid risk, "but the demands now are to do it much faster, to do more scenarios and to do it on demand, and to do some of the risk analytics pre-trade and not just post-trade," Halfon said. "We definitely need to move away from this notion of 'overnight.'"
Real-time integration and warehousing software, in-memory computing tools, distributed data grids, and business intelligence (BI) and analytics tools are some examples of complementary technologies that work together to help organizations become faster, more agile and better at avoiding risk, according to Halfon.
"We hear the word 'agile' a lot when it comes to development," he said. "But I think the notion of agile analytics and calculation is something that we're starting to see a lot of as well."
Remember to focus on data quality
Conference attendee Andrey Pyshkin, a managing partner with RHConsulting, a Russian firm that helps financial services firms get their data management houses in order, added that a strong focus on data quality and data governance should underpin any plan to become more agile and avoid a financial meltdown.
After all, Pyshkin explained, it is exceedingly difficult -- if not impossible -- to calculate risk exposure and predict the future when dealing with unreliable data.
"We are trying to offer our clients more of a focus on data quality," he said. "It is a typical issue for an organization today to have a lot of information, but they cannot enforce the rules to keep this information consistent."
Technologies that help fulfill the need for speed
Halfon spent much of the session talking about technologies that help financial services organizations become faster and more agile when it comes to processing data and reacting to potentially risky situations.
Some of the technologies discussed included "pre-engineered machines," such as data warehouse appliances, data grids, R (the open source programming language for statistical computing and analytics), and the Hadoop Distributed File System and related open source tools.
The key to dealing with growing volumes of data is parallelization, Halfon said, and that often involves building out large grids of computers. He added, however, that many programmers still find it highly challenging to write parallel code.
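The kind of data-parallel risk calculation Halfon describes can be sketched in a few lines of Python, with worker processes standing in for nodes on a compute grid. The position data and the exposure formula here are invented for illustration, not drawn from any real system:

```python
# Minimal sketch of data-parallel risk aggregation: independent positions
# are farmed out to worker processes, and the per-position results are
# summed into an aggregate exposure. Numbers and formula are hypothetical.
from concurrent.futures import ProcessPoolExecutor

def exposure(position):
    # Toy counterparty exposure: notional scaled by a risk weight.
    return position["notional"] * position["risk_weight"]

positions = [
    {"notional": 1_000_000, "risk_weight": 0.08},
    {"notional": 250_000, "risk_weight": 0.20},
    {"notional": 500_000, "risk_weight": 0.50},
]

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        # Each position can be priced independently, so the work
        # parallelizes cleanly across processes.
        exposures = list(pool.map(exposure, positions))
    print(sum(exposures))
```

The easy part, as Halfon notes, is splitting the compute; the hard part at scale is getting each worker fast access to its slice of the data.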
"The big challenge today is how to parallelize data management," he said. "Because some of those grids are starting to choke on data access where the compute is working fine but you can't get the data fast enough through the network into where the computing is being done."
Oracle is starting to see its customers place more emphasis on "data parallelization" and the idea of "moving the compute to the data and not the other way around," Halfon said. One of the technologies that can help financial organizations accomplish that goal is data grid technology.
Halfon pointed out that members of the developer community find data grid distributed computing technologies to be very similar to the Hadoop Distributed File System and related technologies. The main difference between the two is that Hadoop does its computing via a file system, while data grid technologies do all of their computing in memory.
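That distinction can be illustrated with a toy Python sketch -- this is not Hadoop's API or any vendor's data grid API, just the same aggregation run once against file-backed partitions and once against partitions that already live in memory:

```python
# Illustrative contrast: a Hadoop-style job reads each data partition
# back off disk before computing, while a data-grid-style job computes
# directly over partitions already held in memory. Same answer either
# way; what differs is where the data lives relative to the compute.
import json
import os
import tempfile

partitions = [[1, 2, 3], [4, 5], [6]]

# "Hadoop-style": persist each partition to a file, then compute from disk.
paths = []
for part in partitions:
    fd, path = tempfile.mkstemp(suffix=".json")
    with os.fdopen(fd, "w") as f:
        json.dump(part, f)
    paths.append(path)

def sum_partition_from_disk(path):
    with open(path) as f:
        return sum(json.load(f))

disk_total = sum(sum_partition_from_disk(p) for p in paths)
for path in paths:
    os.remove(path)

# "Data-grid-style": the partitions are already in memory; compute in place.
grid_total = sum(sum(part) for part in partitions)

print(disk_total, grid_total)
```

In a real deployment the in-memory partitions would be spread across the RAM of many grid nodes, which is what lets the compute run next to the data instead of dragging the data across the network.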
One example of how data grid technology can make financial services firms more agile and responsive involves organizations that want to aggregate their positions across different lines of business and geographic regions. Using a data grid, a company with offices in New York and Hong Kong can make sure that everyone has access to the same up-to-date information and can react accordingly when issues arise.
"Using a data grid, they'll have a replicated set of data they can work against that is then synchronized over the [wide area network] and of course, that works with pre-engineered machines as well," Halfon said. "We have machines that are pre-engineered for data grids. Ours are called Exalogic. But I promised I wouldn't talk too much about our own technology."
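The replication Halfon describes can be modeled with a deliberately simple Python sketch: two sites each hold a copy of the same positions map, and an update made at one site is pushed to its peer. (Real data grid products such as the ones Halfon mentions handle WAN latency, conflicts and failure recovery; the site names and position figures here are invented.)

```python
# Toy model of a replicated data grid: each site keeps a local copy of
# the positions map, and a write at one site is synchronously copied to
# its peers so both offices work from the same numbers.
class Site:
    def __init__(self, name):
        self.name = name
        self.positions = {}   # this site's local replica
        self.peers = []       # other sites holding replicas

    def put(self, key, value):
        self.positions[key] = value
        for peer in self.peers:
            # Push the raw value rather than calling peer.put(),
            # so replication does not loop back endlessly.
            peer.positions[key] = value

new_york = Site("New York")
hong_kong = Site("Hong Kong")
new_york.peers.append(hong_kong)
hong_kong.peers.append(new_york)

# A trade booked in New York is immediately visible in Hong Kong.
new_york.put("ACME swaps", 42_000_000)
print(hong_kong.positions["ACME swaps"])
```

The point of the sketch is the access pattern: every site reads its own local replica at memory speed, and only writes cross the wide area network.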