Making XML Work
By Nina Mehta
Getting all your data into one fishbowl won't be as easy as the optimists predict.
Deep in the night, and miles from the office, risk managers dream of a world in which data is collected from a hundred different trading desks and a dozen market data vendors and consolidated effortlessly into a single global risk management system, with risk reports and three-dimensional graphs that make everything crystal clear.
Then they wake up.
The dirty little secret is that getting consolidated risk numbers across an enterprise is a grisly—if not impossible—task. It requires extensive mapping between disparate derivatives trading systems, risk engines, third-party feeds, databases and servers—all roped together by in-house IT departments and pricey gangs of implementation specialists. As new instruments are traded, more middleware is added to the slag heap of code that hauls data from one place to another.
So along comes eXtensible Markup Language, supposedly making data mobile and portable by standardizing it and rendering it platform-independent. It obviates the need for middleware by making XMLized data in any application readable by XML-formatted systems. If the data arriving from different source systems and feeds are like so many species of finicky, incompatible fish, the great promise of XML is that it will get the various data to swim together, play together and interoperate as tidily as a school of minnows.
Like many fish stories, however, this is a bit of a tall tale. It isn't a lie, and XML may well be the great enabler its advocates describe, but implementing XML in large financial institutions will be a troublesome and messy task. Legacy data won't give up their informational goods without a lot of costly effort and time, and a number of broader standards-related and derivatives-processing issues could well stall the optimistic, headlong rush toward more efficient data husbandry.
For XML to provide real benefits to the financial services industry, standards need to evolve. XML is essentially a grammar for describing and exchanging messages; to be useful, that grammar must be accepted by those describing and exchanging documents or messages in the relevant markets. A few XML grammars for derivatives already exist, including FpML (developed by JP Morgan and PricewaterhouseCoopers), FinXML (from Integral, a vendor in Palo Alto, Calif.) and Network Trade Model, or NTM (from SunGard Trading and Risk Systems), each with different instrument coverage and levels of specificity. FpML has gained more support over the last few months since it's being developed by a laboriously thorough industry consortium rather than a vendor, but one drawback is that it's currently limited to a dozen or so fixed-income derivatives instruments.
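To make the idea of an XML grammar for derivatives concrete, the sketch below uses Python's standard xml.etree library to serialize the economic terms of a vanilla swap as an XML message. The element names are invented for illustration; they do not follow the actual FpML, FinXML or NTM vocabularies, each of which defines its own schema.

```python
import xml.etree.ElementTree as ET

def build_swap_message(notional, currency, fixed_rate, start, end):
    """Serialize the economic terms of a vanilla swap as an XML message.

    Tag names are illustrative only, not any real published grammar.
    """
    swap = ET.Element("swap")
    ET.SubElement(swap, "notional", currency=currency).text = str(notional)
    ET.SubElement(swap, "fixedRate").text = str(fixed_rate)
    ET.SubElement(swap, "effectiveDate").text = start
    ET.SubElement(swap, "terminationDate").text = end
    return ET.tostring(swap, encoding="unicode")

msg = build_swap_message(10_000_000, "USD", "0.0525", "2000-06-15", "2005-06-15")
print(msg)
```

The point of a shared grammar is simply that any counterparty parsing this message agrees on what `notional` and `fixedRate` mean; absent that agreement, the tags are just another proprietary format.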
For FpML or any other protocol to gain liquidity, it has to be used for trading. This isn't currently happening, although FpML is starting to be used internally by a few banks for writing and communicating the terms of interest rate swaps. Pro-XML forces say a standard will make the slow, telephone-centric world of over-the-counter swaps an electronic marketplace, but this will require a level of discipline and consensus that has—so far—eluded the derivatives industry.
|"People have to sit there and go through the mapping project.... To do an in-and-out message for XML, it probably took us two programmers and one analyst six months."
— Tom Gambino
Consequently, a number of rigors will have to be observed for XML to stay off the shoals of data-integration chaos. XML specifications will have to be scrupulously vetted and published by the bodies that create them, notes Richard Morris, a senior developer at RiskMetrics, which is rolling out a number of products with XML as a key component of their architecture. Banks and institutions will need to adhere religiously to the published public standards or be responsible for creating and publishing their own standards. They will also have to identify the XML format and version they use for individual asset classes and applications, so standard interfaces will be able to translate between XML formats in different vertical markets.
The development of new, untested XML specifications also presents challenges for vendors. Some are unsure when they should adapt to an XML framework, and which published formats should be supported. Moreover, because standards are only just emerging, and radical differences may exist between one version of an XML grammar and another, vendors could find it initially tricky to make their software data structure fully compatible with the standards, points out Yaacov Mutnikas, vice president of software development at Algorithmics, a Toronto-based derivatives risk management software vendor. But this will eventually change, he thinks, "since financial institutions will demand a high level of integration in their processing chain, in order to address challenges such as next-day settlement.”
These issues will take shape as standards develop, but one that already has all the specificity of a pulverizing technological headache is getting legacy data into an XML-compliant format so it can be queried and integrated with front-office position information.
Mapping to XML won't be a cakewalk. Sanjay Mithal, vice president of financial products and services at eCredit.com, calls XML a language that permits a "handshake protocol” between two network applications. And this is true—interactions between systems with application programming interfaces built to particular XML interface specifications will make communication and eventually on-line transactions easier. But when data in legacy systems are involved, implementing XML comes down to a lot of people putting a lot of shoulders to a muddy and seemingly intractable wheel.
The main problem at the moment is the lack of interfaces. Tom Gambino, vice president in technology development at SunGard Trading and Risk Systems, cuts right to the chase. "It's the job of everybody building and feeding XML messages to make sure that what I'm getting is common across systems and platforms," he says. "You can't downplay this—the big thing is that nobody's got an interface. People have to sit there and go through the mapping project." Mapping to XML takes roughly the same effort and time as mapping to a proprietary system, although tools are now coming to market to simplify the task. XML does offer greater transparency than middleware, points out Gambino: it's self-documenting, so a new systems person can pick up where one who just resigned left off, and existing XML interfaces will eventually be leveraged to feed SWIFT confirmations, payment systems and so on. Nonetheless, if an institution is schlepping data in and out of systems, those data must still make the costume change to XML.
So how long does the mapping take? NTM built a protocol for about 15 standard interest rate derivatives such as swaps, caps, floors, variable-rate loans, fixed-rate loans and so on. "To do an in-and-out message for XML,” says Gambino, "it probably took us two programmers and one analyst six months.” That's no mean task. Multiply it by the many hundreds of products traded at some banks, and the chore takes on mythic proportions. On the other hand, points out Mike DeAddio, chief technology officer at Cygnifi, an over-the-counter derivatives risk management application server provider, a large global bank could easily have 1,000 people sitting in its dungeons writing code. In the end, therefore, building to a specification that can be used and reused and that can interoperate with a slew of applications could be a cost savings.
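The "in-and-out message" Gambino describes can be pictured as a pair of mapping functions: one translating a legacy flat-file trade record into an XML message, the other translating it back. The pipe-delimited layout and tag names below are hypothetical, but the round trip shows what a mapping project must get right for every field of every product.

```python
import xml.etree.ElementTree as ET

# Assumed legacy flat-file layout -- a real system would have dozens of fields.
FIELDS = ["trade_id", "product", "notional", "rate"]

def legacy_to_xml(record: str) -> str:
    """Map a pipe-delimited legacy record onto an XML trade message (the 'in')."""
    trade = ET.Element("trade")
    for name, value in zip(FIELDS, record.split("|")):
        ET.SubElement(trade, name).text = value
    return ET.tostring(trade, encoding="unicode")

def xml_to_legacy(xml_msg: str) -> str:
    """Map the XML message back into the legacy format (the 'out')."""
    trade = ET.fromstring(xml_msg)
    return "|".join(trade.findtext(name, default="") for name in FIELDS)

record = "SW-001|swap|5000000|6.25"
assert xml_to_legacy(legacy_to_xml(record)) == record  # the round trip holds
```

Each of the fifteen or so instruments in a protocol like NTM needs its own pair of these mappings, field by field, which is where the two-programmers-and-an-analyst-for-six-months figure comes from.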
Mapping legacy and reference data to XML, however, can be hairier than anticipated. Last year, ABN Amro decided to build an enterprise data warehouse for its treasury portfolio, encompassing foreign exchange spot and forward, options, money-market products, various fixed-income instruments and their associated derivatives—all booked and risk-managed in a range of legacy systems, says Michael Ong, former senior vice president and head of the bank's enterprise risk management function in Chicago, who will be chief risk officer in Credit Agricole's New York office starting next month. ABN Amro at first sought out a middleware solution, but switched to XML when the weeping of systems people got too loud. "That's the short story of a beautiful implementation,” summarizes Ong. The long story was mapping the data from the legacy systems into XML.
|Implementing an XML strategy, like implementing any other large-scale technology decision, is a complex undertaking. The ratio of the complexity between having a plan and doing the actual work of data mapping out of legacy systems is probably 3 to 1, argues Yaacov Mutnikas, vice president of software development at Algorithmics, since the former involves the far stickier task of getting people, rather than data, to agree on a range of issues.
Tom Gambino, vice president in technology development at SunGard Trading and Risk Systems, agrees that implementing XML messaging across databases and systems is primarily a matter of will. "The data-aggregation technology, quite honestly, was there with stored procedures and version-safe flat files," he points out. The threshold effort is lower with XML because the growth of the Internet established the case for formatting standards (HTML, the coding that gives web pages their look, is a close cousin of XML; both descend from SGML), and because technologists are more interested in XML since its platform-independent messaging protocol can be leveraged. Yet the main difference, he insists, is "will feeding the technology and getting to a philosophy in a bank that there is going to be centralized risk management and that it's not going to be left to the individual desk."
Juan Lando, senior manager of risk systems at Arthur Andersen, adds that decision-making at senior levels continues to be a vital issue in risk management technology implementations. Getting people to see risk management not as a cost center but as a benefit—and not merely for the sake of capital adequacy—is still like pulling teeth. "A lot of trading areas are fiefdoms,” he says, "and getting some person or group of individuals to abide by a global rule like how or when they're going to send their risk data to some location is probably the hardest part of the process.” Once business concerns make data-aggregation or an XML implementation a priority, however, the commitment to the technology can fall much more easily into place.
Making the data XML-compliant proved an ordeal. XML easily lives up to its acronym, says Ong, since it's extensible. Unlike most hard-coded middleware solutions, new products can be added to a subportfolio of straight Treasuries, for instance, without much struggle. But the entire process required hundreds of man-hours of mapping and a tremendous amount of data normalization—fields and identification numbers and reference data had to be scrubbed and matched for the data to be consistently structured. One sign of just how mixed a blessing XML can be, at least in its initial stages, is that for every benefit Ong rattles off—such as the language's flexibility over vendor-supplied middleware and the ease of implementing web-based applications—he immediately backtracks. "XML permits ease of data extraction,” he notes. "It allows you to more easily, more readily extract the information you need from databases—I need to keep qualifying what I say,” he groans, remembering.
Although XML has many cheerleaders, not everybody is sure whether it will truly work or be worth the effort. Lehman Brothers is currently facing a large technology choice and is unsure about whether the benefits of XML can be directly quantified. The bank is considering moving to an XML-based messaging format as it ponders whether to shift from a central transactions database for risk management in New York to a more distributed plan.
Like most other banks, Lehman FTPs files of position data between different systems, then goes through a time-consuming normalization process to make sure the information arriving from various front-office systems is consistent. Wayne Kunow, senior vice president in risk management technology, notes that it's difficult to predict whether XML would help or hurt the bank in getting that process done more efficiently. "We have a process in place,” he says, "so it's not like we have to develop something from scratch. Risk management here is basically a homegrown solution, and as part of the normalization process we are already doing the mapping we need to do.”
Across the industry, there is a good deal of conflicted feeling, if not skepticism, about the efficiency of converting data into an XML format. Some of the hesitation results from the utter dependence of XML on the development of industry-accepted standards, and some of it may rest on perceived conflicts with other priorities such as the increasing desire to push the risk management process as close as possible to the front desk and then to combine risk numbers upward from there.
There are also some who question the need for data integration. Amir Khwaja, CEO of Kronos, a London-based software company, argues that the whole data-integration approach to risk is a bit off the mark. Rather than have companies like SunGard Trading and Risk Systems and Algorithmics aggregate a bank's positions in a data warehouse and then calculate value-at-risk and other metrics with their own middle-office analytics (numbers that, he says, can't hold a candle to those crunched in a front-office system from, say, Summit or Murex), he suggests something else. Using XML and Kronos' risk servers, banks can run risk analyses through adaptors built to the APIs of the underlying front-office source systems, eliminating the need to worry about whether the middle office is keeping up with what's going on at the bank's trading desks.
|The XML Dream
|Although XML is only just developing, it's primed to redefine business in the financial services industry.
First and foremost, it will do this by enabling the webbification of applications. "The business-to-business model, commerce on the Internet and ASPs needed a common data format to move things around,” says John Goeller, a director at Salomon Smith Barney—and XML stepped into the breach. "It's a bit of a self-building, bootstrapping operation,” adds Mike DeAddio, chief technology officer at Cygnifi. "Another methodology could have worked, but this one has industry support and the tool sets, and is now being used pretty widely—and the more it's used, the more valuable it becomes.”
Since XML is platform-independent and scalable, new business models now make sense. An XML world will ultimately make mapping costs wither away and a lot of heavy technology unnecessary for web-based applications. New products and asset classes can move on-line once standards appear, and trading is likely to become speedier and more efficient. Ease of data integration will also allow banks to use the data soon to be at their fingertips for better customer relationship management and to pursue new business more nimbly and effectively.
Integrating legacy data with other sources of data will also get easier with time. Extensible Stylesheet Language-based tools and processors, which enable XML templates to translate data from one XML grammar to another, are already being rolled out by vendors such as IBM and Microsoft. Templates for the financial services industry will appear as standards emerge, and will be supported by off-the-shelf tools or available for free from derivatives and risk management software vendors, notes Goeller.
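As a stand-in for what an XSL processor does declaratively with a stylesheet, the sketch below performs the same kind of grammar-to-grammar translation by hand: it rewrites a message from one tag vocabulary into another using a mapping table. Both vocabularies here are invented for illustration.

```python
import xml.etree.ElementTree as ET

# Hypothetical correspondence between two XML grammars for the same swap.
TAG_MAP = {"swap": "irs", "notional": "principal", "fixedRate": "coupon"}

def translate(xml_msg: str) -> str:
    """Rewrite a message from one tag vocabulary into another."""
    def rename(elem):
        elem.tag = TAG_MAP.get(elem.tag, elem.tag)
        for child in elem:
            rename(child)
    root = ET.fromstring(xml_msg)
    rename(root)
    return ET.tostring(root, encoding="unicode")

source = "<swap><notional>1000000</notional><fixedRate>0.06</fixedRate></swap>"
print(translate(source))
# <irs><principal>1000000</principal><coupon>0.06</coupon></irs>
```

An XSL template captures this same correspondence as data rather than code, which is why off-the-shelf processors can apply it without custom programming.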
Another big perk is that XML will liberate data, allowing it to be "repurposed” and used in different formats. XML uncouples the data in a message or document from the formatting of the data. This allows the same data to be displayed in multiple ways, such as on an HTML page on a web browser or on a wireless device, adds Goeller. Eventually, users will also be able to extract the specific information they want from, say, a dozen brokerage firm reports and integrate it in one place that's more convenient.
A related advantage of XML's open standard is that once proprietary protocols are designed, their development could be outsourced to third parties, says Sanjay Mithal, vice president of financial products and services at eCredit.com, potentially shrinking a firm's IT staffing needs.
When it comes to risk management, yet another benefit of XML's open standard is that it will be a snap for banks and financial institutions to make changes to their technology platforms. "If a bank wants to migrate from one front end to another and they both support XML, it can simply move the data from System A to System B," says Pamela Trendell, product manager for SunGard's Panorama MasterFiles. "Before, you'd have to hand-enter all the trades or write a program to translate them from one format to the other."
STP and real-time VAR
Not surprisingly, XML is also hyped as a balm for other risk management woes. Because of its inherent standardization of formats, many people argue that it will trim operational risk and boost straight-through processing. The benefits of STP won't be reaped anytime soon, however, since the front desk is still by and large cut off from the back-office process, but operational risk is likely to be reduced—although not always where expected.
Data, it's worth noting, do not become clean or normalized simply by being filtered through the alembic of a protocol. An XML messaging protocol may get the data into a database more easily, but the data will still need hands-on massaging. Is a field called "maturity" in System A the same thing as a field called "tenor" for a similar product in System B? Does "three months" in some other system mean 90 days or 91 days? If the data are structured differently in different applications, those bumps must be smoothed. "You've got to get the data normalization and improvement process moving along at the same time that you're facilitating an improved technology transfer of data," says James Gertie, director of global capital markets and risk analysis at BankBoston.
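A minimal sketch of that normalization step, assuming an invented alias table and an invented tenor convention: it reconciles the "maturity"/"tenor" naming clash and pins "three months" to a single day count, so that records from different source systems become comparable.

```python
# Assumed mappings -- a real bank would maintain these per source system.
FIELD_ALIASES = {"maturity": "tenor"}          # System A's name for System B's field
TENOR_DAYS = {"three months": 90, "3M": 90}    # one chosen day-count convention

def normalize(record: dict) -> dict:
    """Map source-system field names and tenor strings onto one convention."""
    out = {}
    for key, value in record.items():
        canonical = FIELD_ALIASES.get(key, key)
        if canonical == "tenor" and value in TENOR_DAYS:
            value = TENOR_DAYS[value]  # express every tenor in days
        out[canonical] = value
    return out

system_a = {"maturity": "three months", "notional": 1_000_000}
system_b = {"tenor": "3M", "notional": 2_000_000}
assert normalize(system_a)["tenor"] == normalize(system_b)["tenor"] == 90
```

The mechanics are trivial; the hard part, as Gertie notes, is agreeing on and maintaining the alias tables and conventions across dozens of systems while the feeds keep running.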
And there are other data problems that XML won't solve. A beastly issue that weighs on many a risk manager is market data. The data coming in from one or another third-party vendor often have gaps, off-market prices and transcription errors, all of which must be corrected—and XML can't help with this. There are also other, more technical kinds of operational risk-related problems. Gertie recounts a recent case in which a feed went into a BankBoston system, changed a desk's location and the desk disappeared. "It didn't disappear,” he corrects himself. "It was in the warehouse, but you couldn't see it because it was assigned to Timbuktu, in the Netherlands.” An XML-based interface could possibly have averted this sort of snag.
One area that's a breeding ground for operational risk, and that could see an improvement as XML slides into action, is static and reference data. If XML becomes a messaging and formatting standard, "data can be entered once and published out to the various engines that need it," says Pamela Trendell, product manager for SunGard's Panorama MasterFiles. It's a gargantuan task of mapping, but "there will be synchronization and single entry instead of multiple entry; XML will reduce the costs of maintaining the data, and error rates will drop." Another, perhaps more indirect, way in which XML could help lower operational risk is by enabling best-of-breed applications in the front office and middle office. By allowing systems to be integrated more easily, XML would permit risk managers at different levels to use the systems that best suit them and to slice and dice risk more precisely.
|"Trading areas are fiefdoms, and getting people to abide by a global rule like how or when they're going to send their risk data is the hardest part.”
— Juan Lando
Real-time risk management is another prize rashly associated with the development of XML standards. Enabling data to move around efficiently, with a minimum of missteps, can sometimes make XML seem like a springboard to real-time risk measurement. Getting data aggregated, after all, is a key step in risk analysis. But while some in the derivatives industry chirrup about real-time VAR numbers, that's unlikely to happen anytime soon. It's not a bandwidth issue as some say, argues Cygnifi's DeAddio, but a function of the processing time for derivatives. "Over-the-counter derivatives can't be priced in zero time,” he points out. "You have to apply significant hardware and analytics, which means that if you give me 1,000 swaps and it's an average of 100 milliseconds to price one and get the risk for it, all of a sudden it takes a couple minutes. And the more complicated you get in terms of the analytics, the longer it takes. So there is no real time in the derivatives world.”
Nor is there a compelling need for it, argues Steve Allen, managing director for derivatives risk management at Chase. Having real-time or intraday VAR numbers "implies first of all that you have real-time position information available on a computer, which I think most firms do not have,” he says. More significantly, he adds, "I don't see what the value of it would be from a managerial standpoint.” It may be vital to have a good idea of the firm's real-time net equity positions or the total vega of a position, but no concrete benefit would come from knowing, as he puts it, the "total running VAR on an intraday basis.” There are other important issues a bank can spend time addressing, such as whether it has the right pricing of positions based on what's happening in the market and whether its stress-tests present an adequate representation of what could happen to prices in the near future.
No matter how sophisticated risk management becomes, however, it's unlikely to veer too far from the business of data, since risk engines live and breathe off position and market data. Integrating and sharing data with many legacy systems will therefore remain a crucial task for enterprise risk management. Even with XML, it may not be a "walk in the park" to aggregate positions and data seamlessly from 20 or 30 systems in the very near future, says Algorithmics' Mutnikas, "but certain standard data such as terms and conditions for high-volume, lower-complexity instruments will eventually be integrated with 'plug-and-play' interfaces to the risk management systems."
Not everyone would consider this more than an even bet, but given the general industry support for XML and the extravagant hopes pinned to the idea of a messaging and formatting data standard, it's probably safe to say that the amen corner in the Church of XML is likely to grow.