Data Management a Top Priority for Wall Street Firms by Wall Street & Technology.
A good article by Melanie Rodier of Wall Street & Technology magazine on data management in the finance industry, June 2010.
Data Management a Top Priority for Wall Street Firms
Jun 09, 2010
Data is integral to everything financial organizations do. Beginning with a business deal, transaction or trade, data is aggregated, manipulated, named, channeled into systems, renamed, processed, parsed, and analyzed by people and applications relying on its accuracy to determine the next decision.
But while data management was once considered strictly an IT issue, the credit crisis has changed the way data is viewed by financial organizations. Today it is recognized for what it is: a business driver central to the enterprise and key to its survival.
“It’s no longer data management for its own sake, but it’s to serve business drivers,” says Fritz McCormick, a senior analyst with Aite Group.
Reflecting the cultural shift toward data as a business driver, organizations are hiring chief data officers and proactively reexamining their data management policies and procedures from end to end, experts say. “Businesses need to focus on cost and compliance at an individual and at a corporate level,” observes Ron Ruckh, managing partner at consulting firm LEGG Smart. “One component that flows in these directions is data — on the business side, not just on the IT side.”
Indeed, the responsibility for ensuring data quality has been transformed from an IT task to a pervasive business priority. That transformation largely has been driven by risk management and compliance demands.
“All that analysis about regulatory reform and systemic risk has led the world to understand the importance of data precision, complex processes and the ability to trust your models,” says Michael Atkin, managing director of the Enterprise Data Management (EDM) Council, a nonprofit trade association. “That has been a kick forward for the practice of data management.”
Financial organizations are driven by profitability, fear of doing something wrong or of damaging their reputation, and regulation, Atkin suggests. And post-crisis, regulators are pushing hard for more transparency at financial organizations. It all begins with accurate data, he says. “It might be hard to make the case [for data management] on profitability or reputational issues [alone], but regulation has emerged as the current driver,” Atkin contends.
“Failed trades, errors on client statements, erroneous performance calculations and inaccurate risk measurements, along with a host of other costly, embarrassing and potentially disastrous outcomes, result from poor data quality,” wrote Fred Cohen, group VP and global head of outsourcing in consulting firm Patni’s capital markets and investment banking practice, in a recent report on reference data management. In the current regulatory climate, he tells WS&T, when clients are evaluating their reference data management, they are first preoccupied with reducing risk rather than cost. “Previously, cost was No. 1. Now it’s No. 3, after risk and efficiency,” Cohen says.
The risk of poor data extends beyond operations to investor relations. With regulators publishing new rules on the accuracy of data in advertising materials, financial firms have been forced to pay much closer attention to both the content they put out to the market and the information they don't, according to Conor Smyth, SVP of global sales for data management vendor MoneyMate, which last fall signed a deal with Schroders to provide the investment manager with a managed service to control the aggregation and cleansing of product information for presentation on its website. "Asset managers need to mitigate the risk of communicating incorrect or out-of-date information to the market," Smyth comments.
Gerard Walsh, head of web and CRM at Schroders, commented at the time of the deal that the firm’s decision to hire MoneyMate stemmed from an understanding of the significance of the data management challenge the firm sought to address. “The focus of this initiative is to present accurate, up-to-date and error-free product information on our websites,” he noted.
Understanding Data in Context
Financial institutions generally recognize the operational importance of data precision and have been addressing data definition problems such as identification and tagging, says the EDM Council's Atkin. But the concept of data content management rather than data processing management is still a relatively new idea for the financial industry, he points out.
“It’s about getting your arms around what it means to manage data where precision and comparability matter as opposed to something that [merely] feeds data systems,” Atkin relates. “People understand it, but it’s still not a given for all financial organizations, even though it’s something that’s been talked about for 10 years.”
Now, however, as risk management and regulation are increasingly driving data management, and as IT budgets begin to rise again, even firms that weren’t already motivated by the operational efficiencies that can be gained from automated data processes are beginning to reinvest in data infrastructure, Atkin suggests. But as they re-evaluate their data infrastructures, he stresses, firms must start by identifying the pertinent data, including the instruments that the firms trade, securities’ prices, market entity information and macroeconomic statistics, such as weather and interest rates.
“It’s not a technology problem — it’s a communications problem between business requirements, operational requirements and technology requirements,” Atkin asserts. “You have to translate it well. More than a technology problem, it’s an alignment challenge to speak the same language to each other.”
According to Sateesh Prabakaran, chief architect, managing director, BNY Mellon, even if an organization has a mature infrastructure, data management can be a challenge that requires careful balancing of business and technology requirements. “In a large organization like ours, where we play a critical role in the global markets, data is the premium ingredient,” he says. “BNY has made the effort to focus on data quality, migration and cleansing infrastructure, including [the ability to query data and its] reliability and availability, as well as the non-still nature of data. But it has to be the right balance between business pressures and the practicalities of implementing that.”
Prabakaran points to data lineage in particular as a key challenge. "There is always the context of data as it travels through different processing nodes that sometimes transcend boundaries," he says. "Lineage sometimes gets morphed, updated, etc. Unless you have a holistic view of how it started, of the journey it took, it would be very hard to recognize the lineage of the data."
Answering some of the key operational questions in the business workflow requires understanding the unique nature of data housed in various data processing units, Prabakaran continues. “These may have been developed specifically to answer very direct questions based on what we knew at the time, but as things change, some questions are completely new. In order to answer these questions, we have to improvise with whatever systems and data points we know and try to be intelligent in interpreting and getting the information out of different sources,” he explains.
"Since it's the bread and butter of different operational issues, we can't afford guesses," Prabakaran adds. "You have to strive for internal consistency so that the data can withstand scrutiny. That's why lineage is important in solving unanticipated needs of the business."
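Prabakaran's lineage point can be made concrete with a small sketch. The following is a toy illustration, not a description of BNY Mellon's systems; every name in it (LineageRecord, the node names, the vendor feed) is invented. The idea is simply that each processing hop is recorded alongside the data, so the journey can be reconstructed later:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class LineageHop:
    """One processing node the data passed through."""
    node: str          # illustrative node names, e.g. "trade-capture"
    operation: str     # what the node did to the data
    at: datetime       # when the hop happened

@dataclass
class LineageRecord:
    """A holistic view of a datum's journey from its point of origin."""
    source: str                               # where the data entered the firm
    hops: list = field(default_factory=list)

    def record_hop(self, node: str, operation: str) -> None:
        self.hops.append(LineageHop(node, operation, datetime.now(timezone.utc)))

    def journey(self) -> str:
        steps = [self.source] + [f"{h.node}:{h.operation}" for h in self.hops]
        return " -> ".join(steps)

# Reconstructing how a figure reached a report, hop by hop.
lineage = LineageRecord(source="vendor-feed")
lineage.record_hop("trade-capture", "parse")
lineage.record_hop("cleansing", "normalize-identifiers")
lineage.record_hop("risk-engine", "aggregate")
print(lineage.journey())
# vendor-feed -> trade-capture:parse -> cleansing:normalize-identifiers -> risk-engine:aggregate
```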
Equally important is to have a holistic view of data at any point in time. “We strive for dashboards that give you a piece of information within the context of cross lines of business,” Prabakaran says. “If you look at some of the dashboards flight controllers use, it’s not how you direct one flight to its destination, but how you navigate around the others. You need to have a holistic view.”
As the market emerges from the recent financial crisis, there is a growing emphasis on treating core reference data management issues as an enterprisewide priority, confirms Patni’s Cohen, referring to a reference data management survey conducted by Patni and FSO Knowledge Exchange in December 2009 and January 2010. The survey questioned 52 senior financial industry executives, 48 percent of whom were from buy-side firms and 12 percent from brokerages.
“A significant number of firms acknowledge that siloed organizational structures and practices had resulted in duplicated reference data sources,” Cohen said in the report. “With firms’ focus on risk and efficiency, it is clear that firms should follow a holistic approach to reference data management on an enterprisewide basis.”
Cohen argues that an enterprisewide data management approach helps firms save money as well as identify precisely where their data resides and who is using it. "We found that 25 percent of all reference data management spend in a financial institution is wasted in multiple data silos, and in managing and conditioning the data across all these," he tells WS&T. "Our survey found that 80 percent of people said once the data has gotten into their institutions from small vendors, only a small percentage know where it is going. They don't know who's using it, who's not using it anymore. There may be data that no one is using anymore."
Still, while breaking down silos makes sense, many organizations have found obstacles to achieving an enterprisewide view of data and in coordinating efforts across different lines of business to ensure that data is managed properly, says Sajay Sethunath, an associate partner in the technology group at Capco, a consulting, technology and transformation services provider. As a result, many firms attempt an incremental approach to data management. According to Sethunath, “A lot say, ‘Let’s look at it from a targeted initiative. Let’s bite off portions of it. Let’s put the right kind of attributes in place, and let’s aggregate on the back end.’ ”
BNY Mellon's Prabakaran suggests that some of the challenges to achieving an enterprisewide view of data originate from the boundaries and constraints that management uses to define the enterprise view. "The question is: How elastic are you in adjusting to boundaries?" he says. "That will be one of the most consistent problems."
In Support of Big Governance
In the meantime, firms increasingly are focusing on data governance, which details the ownership and management of a firm’s data assets. This includes monitoring the source of the data, tracking its progress through the cleansing process, identifying and mapping data elements, tracking which applications use the data and at what frequency, and controlling who is authorized to access and manipulate it.
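To make the scope of that list tangible, here is a minimal sketch of the kind of metadata record such a governance regime implies. All field names and values are invented for illustration; no firm's actual data model is being described:

```python
from dataclasses import dataclass, field

@dataclass
class GovernanceRecord:
    """Illustrative governance metadata for a single data asset."""
    asset: str            # e.g. "security-master.prices" (invented)
    owner: str            # the accountable business owner
    source: str           # upstream feed or system of record
    cleansing_stage: str  # e.g. "raw", "validated", "published"
    consumers: dict = field(default_factory=dict)  # application -> access frequency
    editors: set = field(default_factory=set)      # who may manipulate the data

    def register_consumer(self, application: str, frequency: str) -> None:
        self.consumers[application] = frequency

    def may_edit(self, user: str) -> bool:
        return user in self.editors

# One asset, its owner, its consumers, and who is authorized to touch it.
record = GovernanceRecord(
    asset="security-master.prices",
    owner="reference-data-group",
    source="vendor-feed",
    cleansing_stage="validated",
)
record.register_consumer("risk-engine", "intraday")
record.editors.add("data-steward")
print(record.may_edit("data-steward"))  # True
```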
While a relatively new concept, data governance quickly is becoming the foundation of a firm’s data management strategy, the Patni study asserts. In some cases, according to the report, data governance drives the decisions on the infrastructure, architecture and processes required to manage and deliver data to consumers.
“Governance is definitely an issue now,” Aite’s McCormick concurs. “The market has matured along the way, and you now see very well established data groups within organizations. There are industry associations and a lot of participation in those groups.”
As data governance matures, data management technologies also are evolving. “We have gone from a typical data implementation of fancy, highly scalable in-memory databases to what we see as storage of data — also known as the appliance model — playing more of a role in high-volume analytical databases,” reports BNY Mellon’s Prabakaran.
"We're taking some lessons learned from server consolidation techniques and bringing them into large-scale data storage," he continues. "It's not new, but it's interesting how that model seems to permeate not just large-scale analytical information stores with hundreds of terabytes, but is also moving into the mainstream, where it can be used for transaction processing."
BNY Mellon also is looking closely at data virtualization, Prabakaran reveals. “I don’t have to have another data warehouse. By using virtualization, I will bring [the data] in whatever shape or form answers my business questions,” he relates. “That concept might work on small-scale systems, but we’re planning to see if it can work on large infrastructures.”
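The virtualization idea Prabakaran describes can be sketched in a few lines. This is a toy federated-query layer with invented sources and records; it shows only the shape of the approach, namely answering a business question across existing stores at request time instead of copying everything into another warehouse first:

```python
class VirtualDataLayer:
    """Toy data-virtualization layer: federate queries across existing
    stores on demand rather than materializing another warehouse."""

    def __init__(self):
        self.sources = {}  # name -> callable that fetches records on demand

    def register(self, name, fetch):
        """Register a source; `fetch` pulls records only when queried."""
        self.sources[name] = fetch

    def query(self, predicate):
        """Pull from every source at request time, filter, and merge."""
        for name, fetch in self.sources.items():
            for record in fetch():
                if predicate(record):
                    yield name, record

# Invented sources standing in for real systems of record.
layer = VirtualDataLayer()
layer.register("trades", lambda: [{"id": 1, "desk": "fx", "notional": 5_000_000}])
layer.register("positions", lambda: [{"id": 9, "desk": "fx", "notional": 12_000_000}])

# One business question, answered across both stores without a copy step.
for source, record in layer.query(lambda r: r["desk"] == "fx"):
    print(source, record)
```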
Data management techniques and technologies are fundamental to the creation of applications, notes Rod Johnson, general manager of the SpringSource division of VMware, and with the rise of virtualization and cloud computing, the manner in which applications need to access data is evolving. “Cloud computing is a distributed deployment model, and for that reason, caching and data accessibility are of far greater strategic importance than before,” he explains.
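Johnson's caching point can be illustrated with a simple read-through cache. The sketch below is generic; the backing store and TTL value are invented for the example. The point is only that in a distributed deployment, repeat reads should not have to cross the network:

```python
import time

class ReadThroughCache:
    """Toy read-through cache: serve hot data locally and fall back to the
    slower (possibly remote) store only on a miss or after expiry."""

    def __init__(self, load_from_store, ttl_seconds=60):
        self.load = load_from_store  # callable: key -> value (the remote fetch)
        self.ttl = ttl_seconds
        self._entries = {}           # key -> (value, fetched_at)

    def get(self, key):
        entry = self._entries.get(key)
        if entry is not None:
            value, fetched_at = entry
            if time.monotonic() - fetched_at < self.ttl:
                return value         # cache hit: no remote round trip
        value = self.load(key)       # cache miss: go to the backing store
        self._entries[key] = (value, time.monotonic())
        return value

# An invented backing store; in a cloud deployment this lookup would be a
# network hop, which is exactly what the cache avoids on repeat access.
prices = ReadThroughCache(lambda key: {"IBM": 128.5}.get(key))
print(prices.get("IBM"))  # first call hits the store
print(prices.get("IBM"))  # second call is served locally from the cache
```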
As regulators continue to scrutinize companies' data management practices, the industry will see more investment and innovation in that area, says the EDM Council's Atkin. "If you put it into perspective, there have been three drivers for data management," he says. "First, global terrorism, which caused the regulatory community to insist we validate customers, know them and follow them." Then the Enron and WorldCom scandals forced the industry to understand the importance of entity relationships so it could understand the risk inherent in doing business with particular people, Atkin adds.
“Those first two drivers were the beginning of data activity,” he says. “But [regulation] is the big one. This will force firms to look [at data] across the organization in the same way that regulators have to look across the organization.
“This is the age of enlightenment in data management,” Atkin continues. “It’s no longer a poor, ugly stepchild of IT. It’s being recognized as an operational pillar of infrastructure.”