Hybrid Solutions to the Systems Integration Problem

Four innovative firms have taken radically new approaches to the old problem of integrating applications and data.

By Carl Reinhardt

One of the most frustrating parts of technology management at most financial institutions is integrating numerous data sources and applications while retaining system flexibility and speed. A number of vendors are hoping to capitalize on this problem by offering hybrid solutions that marry system integration tools with generic analytics and applications.

"Financial institutions today are more concerned with choosing applications and data sources that will be compatible with their existing systems architecture than with choosing the best software or data source for each job,” says Nigel Webb, a partner in Arthur Andersen's risk management systems practice. "This is because many firms have achieved disastrous results by attempting to integrate company-wide systems and data on a large scale.”

Data warehousing projects, in particular, are notorious for cost overruns, complex architecture and less-than-useful results. These problems can afflict data warehousing schemes that are designed to store transaction data, market data or both. Often, unclean market data and incomplete transaction data lead to incorrect analysis, poor decision-making and costly hours of reconciliation after the fact.

The pitfalls of data warehousing are of particular concern to global portfolio and risk managers who must obtain accurate, firm-wide data in order to do their jobs. “We have a data warehouse and an enterprise risk management system, but don't really use that system for real decisions,” admits one risk manager at a major financial institution. “The information is usually out of date or out of whack.”

Financial systems vendors, of course, are not oblivious to these complaints. Several have recently introduced products designed to ease users' systems integration burdens while enhancing flexibility for individual users.

In the pages that follow, we profile four markedly different recent products that share a common ambition: to address these integration concerns by recombining elements of the systems puzzle in new ways. Their developers are also eager to use thin-client technology to accelerate both implementation time and systems performance. In many cases, the new approaches have left potential users scratching their heads. Seen together, however, they may point the way to a new generation of systems with greater mastery over the seemingly unsolvable integration nightmare.


FTI's The Box

A data model for the whole firm, with embedded middleware.

The Box
Contact:
Rob Flatley
Financial Technologies International
770 Broadway
New York, NY 10003
212-460-7150
e-mail: flatley@ftintl.com
Internet: www.ftintl.com

The Box Universal Financial Server from Financial Technologies International (FTI) tries to take traditional data warehousing to the next level. Financial institutions that have attempted to collect the prerequisite transaction and market data, standardize the data values, compute consolidated financial positions, and place it all in a well-structured, documented, scalable database know the challenges involved.

The standard method of managing the data from the front, middle and back office is to build individual data interfaces for each data source. Once that's done, some sort of middleware is used to make sure all the data end up in the appropriate place. That's fine in theory, but it usually doesn't work well. The data models that organize data for the front office are usually incompatible with the models for middle and back. Extending a front-office model to encompass the other areas is also problematic. The result is usually an incomplete data warehouse.
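
To make the contrast concrete, the conventional approach can be pictured as one adapter per source feeding a routing layer. The minimal Java sketch below is our own illustration; the interface and field names are invented, not any vendor's:

```java
import java.util.List;
import java.util.Map;

/** One adapter per front-, middle- or back-office source (hypothetical). */
interface SourceAdapter {
    List<Map<String, String>> extract(); // native records in a common key-value form
}

public class MiddlewareRouter {
    /** Crude stand-in for middleware: route each record by a "type" field. */
    static void route(List<Map<String, String>> records) {
        for (Map<String, String> rec : records) {
            String dest = rec.getOrDefault("type", "unknown");
            System.out.println("-> " + dest + " store: " + rec);
        }
    }

    public static void main(String[] args) {
        SourceAdapter frontOffice = () -> List.of(
            Map.of("type", "trade", "id", "T-1", "notional", "1000000"));
        route(frontOffice.extract());
    }
}
```

Every new source means another adapter and more routing rules, which is exactly where the incompatible data models the article describes begin to bite.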

The Box gets around these problems by building a more complete data model in a consistent format that makes it much easier to pump data to and from various parts of the company. It also includes embedded middleware that makes sure the right data go to the right place, and sophisticated web-enabled query tools to allow everybody in the company to get at the data easily.

Multiple forms of middleware, relational database technologies and a Java-based graphical user interface combine to create what financial technology analyst Larry Tabb of the Tower Group calls a “transactional data warehouse.” “We have seen the cost of doing this through in-house development exceed many millions of dollars and take years,” notes David McDonald, a partner at Coopers & Lybrand Consulting. “The introduction of The Box has slashed cost and time, not to mention eliminating the development risk.”

The Box collects and disseminates firm-wide data in a transaction-based or market-feed format, enabling straight-through transaction processing or straight-through information creation. The goal is to support risk management as well as customer service, portfolio management, compliance and any function that requires consolidated real-time information.

The core technologies behind The Box include:

Data Mapping. Because it operates like a middleware platform with data connections mapped through it, The Box facilitates the addition of new office applications or migration to newer technologies. It comes with all the necessary mapping tools and predefined relationships to enable users to indicate and structure the interrelationships among customers, assets and the firm.

Real-Time Data Management. The Box utilizes relational database technology with real-time data management and distribution capabilities. Initially designed with Oracle in mind, it also runs comfortably on DB2, and versions for Sybase and Microsoft SQL Server users are already in the works. The Box is an open and scalable system in either a UNIX or Windows NT fault-tolerant environment.

By serving as a real-time database of current balances and exposures, The Box enables users to evaluate their positions and decisions based on dynamic data, rather than old, static data from the last processing cycle. Offering views from the enterprise down to the transaction, and every possible option between, The Box can serve the information needs of virtually any constituency.
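
The difference between batch-cycle and real-time balances is easy to picture in code. The following toy Java class (names and structure are our own invention, not FTI's) updates positions as transactions arrive, so a query at any moment sees current data:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

/** Minimal sketch of a real-time balance store. */
public class LiveBalances {
    private final Map<String, Double> balanceByAccount = new ConcurrentHashMap<>();

    /** Apply a transaction the moment it arrives, not at end of day. */
    public void post(String account, double amount) {
        balanceByAccount.merge(account, amount, Double::sum);
    }

    /** Any user sees the current, not last-cycle, balance. */
    public double currentBalance(String account) {
        return balanceByAccount.getOrDefault(account, 0.0);
    }

    public static void main(String[] args) {
        LiveBalances box = new LiveBalances();
        box.post("ACCT-1", 500_000);
        box.post("ACCT-1", -125_000);
        System.out.println(box.currentBalance("ACCT-1")); // 375000.0
    }
}
```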

The Global Financial Data Model. The Box is built on FTI's Global Financial Data Model, arguably the most comprehensive data model available to financial services firms. It encompasses more than 650 entities with 8,000 attributes and more than 1,700 defined relationships. The Box handles all types of financial transactions and all types of assets, from forward rate agreements, swaps and complex derivatives down to checking accounts. It is multicurrency capable and euro- and year-2000 compliant. “Because of its origins in trust operations, which span the full breadth of banking and capital asset applications, The Box's Global Financial Data Model overcomes the Balkanization of technological systems,” says Michon Schenck, director of financial services strategy at Sybase Inc.

Thin-client GUI. The Box's thin-client browser interface economically provides information to internal or external clients. It is a messaging structure that allows users to drill down to any level. Users can access and query data across the entire spectrum of possibilities: by asset class, by a specific asset, by account, by time period and, of course, firm-wide.

Likewise, users can customize which data they choose to take out of The Box. If your equity trading manager prefers data vendor A's feed, but your fixed-income manager likes B, there is no problem—The Box can address the requirements of specific departments and specific users within the firm without compromising performance.
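
A minimal sketch of that per-desk flexibility, with invented names, might look like this:

```java
import java.util.Map;

/** Illustrative only: each desk pulls prices from its preferred vendor feed. */
public class FeedSelector {
    private final Map<String, String> feedByDesk = Map.of(
        "equities", "VendorA",
        "fixed-income", "VendorB");

    public String feedFor(String desk) {
        return feedByDesk.getOrDefault(desk, "default-feed");
    }

    public static void main(String[] args) {
        FeedSelector sel = new FeedSelector();
        System.out.println(sel.feedFor("equities"));     // VendorA
        System.out.println(sel.feedFor("fixed-income")); // VendorB
    }
}
```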


Integral's Risk Management Framework

An object-oriented risk management system with speed-enhancing data management tools.

Integral
Contact:
Tim Westhoff
Integral Development Corp.
140 Broadway
46th Floor
New York, NY 10004
212-269-3900
e-mail: tim.westhoff@integral.com
Internet: www.integral.com

Integral's Risk Management Framework 1.0 is a flexible risk management system that combines prepackaged calculation tools with sophisticated data management and object programming techniques. The result is a scalable software package that promises to reduce dramatically the time and cost associated with large-scale systems implementations and provides the processing speed necessary for performing portfolio-wide Monte Carlo simulation.

The Risk Management Framework was designed in response to demands for an industrial-strength risk management system that was flexible but could be implemented faster and more cheaply than most tool-kit schemes and enterprise-level risk management applications on the market today.

"To date, there have been two basic approaches to risk management systems,” explains Harpal Sandhu, president and founder of Integral. "The first is the tool kit approach, in which financial institutions purchase object code and then modify or wrap these objects to create their own systems, which could include code from both an external vendor and internal sources. The upside to this approach is flexibility. The downside is often high development costs and a lengthy implementation phase. The other approach—which may include a large, multifunctional application and a prepackaged data model—at first appears to solve some of the issues associated with building a system. But often these packaged systems' lack of flexibility leads to numerous system revisions and difficulties in blending the system into an existing environment.”

The Integral Risk Framework, he explains, creates a best-practices model for risk systems implementations by incorporating multiple data management and analytic components, optimized for large-scale risk management, which together form a software package that is both flexible and cost-effective.

The core technologies underlying the Risk Framework include the following:

Universal Data Mapper. The Universal Data Mapper is a data transformational engine that is used to extract data from multiple, disparate sources and reformat them for storage in a risk data repository or for optimal consumption by Integral's valuation and scenario generation engines. Typically, explains Sandhu, data transformation must be handled separately by a bank's IT department or pricey systems integrators who may use third-party middleware, third-party interface engines, proprietary interface programs or—most likely—a combination of all three. This process, needless to say, is both costly and time-consuming.
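
The basic transformation idea is simple enough to sketch: a mapping table renames source-system fields into a target schema. The Java below is illustrative only; the field names are invented and this is not Integral's API:

```java
import java.util.HashMap;
import java.util.Map;

/** Toy field-renaming transformer standing in for a data mapper. */
public class DataMapper {
    private final Map<String, String> fieldMap;

    public DataMapper(Map<String, String> fieldMap) { this.fieldMap = fieldMap; }

    /** Rename each source field to its target-schema name, passing unknowns through. */
    public Map<String, String> transform(Map<String, String> sourceRecord) {
        Map<String, String> out = new HashMap<>();
        sourceRecord.forEach((k, v) -> out.put(fieldMap.getOrDefault(k, k), v));
        return out;
    }

    public static void main(String[] args) {
        DataMapper m = new DataMapper(Map.of("cpty", "counterparty", "amt", "notional"));
        System.out.println(m.transform(Map.of("cpty", "BANK1", "amt", "5000000")));
    }
}
```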

Integration Data Model. The Integration Data Model is designed to structure the storage of complex financial data for optimal processing speed. Rather than simply copying existing transaction records whole, Integral's Data Model parses complex transactions into components known as deal-lets, which are streamlined for rapid and accurate valuation. The Integration Data Model also reduces the volume of data that must be processed by stripping out information such as accounting codes, approval signatures and so on, which is not essential for accurate valuation (although, of course, this information may have critical importance elsewhere in the organization).
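
For illustration, here is one way a fixed-for-floating swap might decompose into two valuation-ready deal-lets. The record fields and the split are our assumptions, not Integral's actual schema:

```java
import java.util.List;

public class DealLets {
    /** Only what a valuation engine needs; sign marks pay vs. receive. */
    record DealLet(String dealId, String leg, double notional, double rate) {}

    /** A fixed-for-floating swap splits into one deal-let per leg. */
    static List<DealLet> parseSwap(String dealId, double notional,
                                   double fixedRate, double floatSpread) {
        // Accounting codes, approval signatures and the like are dropped here;
        // they matter elsewhere in the firm, but not for valuation.
        return List.of(
            new DealLet(dealId, "fixed", notional, fixedRate),
            new DealLet(dealId, "float", -notional, floatSpread));
    }

    public static void main(String[] args) {
        parseSwap("SWP-42", 10_000_000, 0.055, 0.002).forEach(System.out::println);
    }
}
```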

Scenario Engine. The Scenario Engine accepts historical and real-time market data feeds from numerous sources via a wide selection of prebuilt interfaces, and calculates numerous—and often complex—scenarios that users control through a flexible configuration utility. The Scenario Engine also decouples the scenario generation process from deal valuation. This means that scenarios can be calculated, stored and reused as necessary, thus dramatically enhancing performance speed, particularly when users are running a what-if simulation or recalculating Monte Carlo-based value-at-risk to account for a new transaction.
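
The performance payoff of that decoupling is easy to demonstrate. In the hypothetical Java sketch below, Monte Carlo scenarios are generated once and cached, so pricing a new trade only means revaluing it against the stored paths:

```java
import java.util.Random;
import java.util.function.ToDoubleFunction;

/** Scenario generation decoupled from valuation: generate once, reuse often. */
public class ScenarioCache {
    private final double[][] scenarios; // [path][risk factor]

    public ScenarioCache(int paths, int factors, long seed) {
        Random rng = new Random(seed);
        scenarios = new double[paths][factors];
        for (double[] path : scenarios)
            for (int f = 0; f < path.length; f++)
                path[f] = rng.nextGaussian(); // placeholder market shocks
    }

    /** Value any deal against the same stored scenario set. */
    public double expectedValue(ToDoubleFunction<double[]> pricer) {
        double sum = 0;
        for (double[] path : scenarios) sum += pricer.applyAsDouble(path);
        return sum / scenarios.length;
    }

    public static void main(String[] args) {
        ScenarioCache cache = new ScenarioCache(10_000, 1, 7);
        // A new trade is priced against cached paths; no regeneration needed.
        System.out.println(cache.expectedValue(p -> 100 + 5 * p[0]));
    }
}
```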

Valuation Engine. The Valuation Engine uses results obtained from the Data Model and the Scenario Engine to produce a distribution of results pertaining to sensitivity analysis, VAR and so on. The Valuation Engine has been optimized for large-scale risk management analysis. This means that the Valuation Engine was not built—as is the case with many risk management systems today—by wrapping a simulation engine around code originally designed for the valuation of individual deals via a front-office system. This front-office-oriented code may include date calculators and other features that are not relevant to a large-scale, portfolio-level simulation.

Results Database and Reporting. The Results Database stores the result sets produced by the Valuation Engine. In conjunction with standard query tools (such as SQL, IQ and OLAP), users can drill through these data according to their own specifications, which may include currency, maturity and so on. The Results Database can also be configured for easy access through standard web browsers, thus enhancing the value of risk reports.
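
Since the Results Database is reachable with standard query tools, drilling into it can be as plain as SQL over JDBC. The table and column names below are invented for illustration; the article specifies only that standard tools apply:

```java
import java.sql.*;

/** Hypothetical drill-down by currency for a given maturity bucket. */
public class ResultsQuery {
    public static void main(String[] args) throws SQLException {
        // "jdbc:your-db-url", "var_results", "var_99" etc. are placeholders.
        try (Connection con = DriverManager.getConnection("jdbc:your-db-url");
             PreparedStatement ps = con.prepareStatement(
                 "SELECT currency, SUM(var_99) FROM var_results " +
                 "WHERE maturity_bucket = ? GROUP BY currency")) {
            ps.setString(1, "1Y");
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next())
                    System.out.println(rs.getString(1) + ": " + rs.getDouble(2));
            }
        }
    }
}
```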

Inventure's Ranger

Integrating data without a data warehouse or a data model.

Ranger
Contact:
Stacey Pritchett
Inventure America
30 Broad St.
22nd Floor
New York, NY 10004
212-825-2341
e-mail: spritche@inven.com
Internet: www.inven.com

RANGER takes a revolutionary approach to providing global systems integration and access that does away with the physical data warehousing concept altogether in favor of an approach that Inventure chairman Michael Adam calls “distributed integration.” This approach takes advantage of cascaded servers and data caching—technologies that have figured prominently in the development of the Internet—to link heterogeneous data and applications in real time without sacrificing local innovation and flexibility.

"The Internet provides a single point of access to a vast array of information that is maintained by numerous local organizations and individuals, each having a vested interest in a highly specific subset of this information,” Adam explains. "RANGER applies the same principle to financial institutions, which are also characterized by disparate groups who care deeply about a specific set of systems and data but also require access to the whole.”

Rather than requiring the creation of a centralized data warehouse that must be fed through numerous custom interface programs, layers of middleware and so on, RANGER allows users to create a virtual database at the server level that can then be accessed by staff around the world through a secure corporate intranet. “As a result,” says Adam, “data remains under the control of those who truly care about it but accessible to everyone who needs it.”

RANGER's flexible Metadata system allows users to build—and quickly revise—crucial data relationships and interdependencies without relying on a static, relational data model. As a result, it promises to make new data available to applications regardless of their format and physical location—and without the cost and delays associated with updating a complex physical data warehouse. Local and enterprise-level reporting under the RANGER environment can thus be more responsive to new business requirements.
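
A toy version of that metadata-driven “virtual database” idea: a catalog records where each logical field physically lives, and queries resolve locations at request time instead of copying data into one warehouse. All names here are invented:

```java
import java.util.Map;

/** Metadata says where data lives; nothing is copied into a central store. */
public class VirtualCatalog {
    // logical field -> { server, native field name }
    private final Map<String, String[]> metadata = Map.of(
        "position.notional", new String[]{"tokyo-server", "AMT"},
        "position.trader",   new String[]{"ny-server", "TRADER_ID"});

    public String locate(String logicalField) {
        String[] loc = metadata.get(logicalField);
        return loc == null ? "unknown" : loc[0] + "/" + loc[1];
    }

    public static void main(String[] args) {
        VirtualCatalog cat = new VirtualCatalog();
        System.out.println(cat.locate("position.notional")); // tokyo-server/AMT
    }
}
```

Revising a relationship means editing the metadata entry, not rebuilding a physical warehouse, which is the flexibility the paragraph above claims.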

Because RANGER includes distributed processing technology, it is linearly scalable: performance improves roughly in proportion to the number of processors added to the system. RANGER can therefore scale up or down to meet the needs of both large and small institutions. Communications between RANGER Servers and RANGER clients are also optimized to reduce network traffic and enhance speed.

RANGER's own scripting language, RSL, is JavaScript-compatible, and the Java Virtual Machine at the heart of RANGER comes with a full set of Java interfaces and Java graphical components. For those who are less adventurous, RANGER also comes with a packaged graphical interface that gives immediate access to the infrastructure.


The RANGER architecture includes the following components:

RANGER Server. The RANGER Server organizes and transforms firm-wide data, regardless of physical location and format, into an enterprise virtual database and provides system access through a Java-based user interface via a web-enabled network. New applications and analytics are easily distributed to users through Java applets, small Java applications embedded in an HTML page that users can download and run within a web browser. User requests are formatted as URLs for rapid transmission, and RANGER sends time series data back to the client level in a compressed, binary format designed to minimize network traffic.

The RANGER Server also includes the RANGER Metadata system, which allows users to create custom and enterprise-level data models that integrate time series data, relational data and analytics into functionally time-aware objects.

RANGER BarTender. The RANGER BarTender server is an infrastructure that collects, cleans, stores, manipulates and distributes real-time and intraday data throughout an organization, virtually removing the distinction between historic and real-time data. This feature is particularly important to financial institutions, which must seamlessly integrate time-series data with other forms of market data. (Time-series data are typically stored in a unique format to maximize speed and minimize storage space.)
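
The effect of erasing the historic/real-time divide can be sketched with a single time-indexed structure that both archived history and live ticks feed into. This is a conceptual illustration, not Inventure's implementation:

```java
import java.util.TreeMap;

/** One time-indexed store serves both archived history and intraday ticks. */
public class SeamlessSeries {
    private final TreeMap<Long, Double> series = new TreeMap<>();

    public void loadHistory(long[] times, double[] prices) {
        for (int i = 0; i < times.length; i++) series.put(times[i], prices[i]);
    }

    /** Live ticks land in the same structure as history. */
    public void onTick(long time, double price) { series.put(time, price); }

    /** One query spans archived and intraday data alike. */
    public double latestAsOf(long time) { return series.floorEntry(time).getValue(); }

    public static void main(String[] args) {
        SeamlessSeries s = new SeamlessSeries();
        s.loadHistory(new long[]{1, 2, 3}, new double[]{100.0, 100.5, 101.0});
        s.onTick(4, 101.25); // real-time update
        System.out.println(s.latestAsOf(4)); // 101.25
    }
}
```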

RANGER Developer. The RANGER Developer is a tool kit for the development of standards-compliant applications to integrate data, functions and objects. Within the RANGER environment, these custom applications may be integrated with a firm's existing library of applications.

RANGER Researcher. The RANGER Researcher includes the RANGER Object Matrix and the RANGER Scripting Language interface to commercial and proprietary data, along with RANGER and third-party analytics. Both tools can be used to rapidly introduce and distribute new valuation models, risk management models and other analytics, as well as new forms of data.

RANGER Visualizer. The RANGER Visualizer is an on-line engine that allows users to visualize and manipulate data and run applications. RANGER Visualizer provides a powerful and simple Java-based GUI, query tools, a gateway to Excel and other standard desktop applications. Unique optimization permits three-dimensional graphics to be displayed on PCs. As the user's gateway into the RANGER environment, the RANGER Visualizer can be customized to meet individuals' specific analytic and data collection needs.


CastleNet's The Beast

Multiple data feeds and high tech analytics on a speedy thin-client platform.

The Beast
Contact:
Carl Carrie
CastleNet
One Seaport Plaza
21st Floor
New York, NY 10038
212-208-5000
e-mail: ccarrie@thebeast.com
Internet: www.thebeast.com

Obtaining real-time, consistent market data from multiple sources and in multiple formats, integrating these data with analytics and then serving them up to traders via a flexible GUI has heretofore been an impossible dream. CastleNet's The Beast manages to pull off this stunt admirably, sidestepping the multiple integration techniques others use to accomplish the same task.

The Beast is a scalable front-office market data delivery system that combines numerous data feeds, news reports, web-based data and both internal and external communications. It also boasts sophisticated analytics that allow users to run real-time market simulations, price single deals and build (and distribute) new analytic models on the fly. For example, users can dynamically design, modify and save their own yield curve sets, picking and choosing not only which data points along the curve to modify but also which data to use and which source (a live data feed or a database) to draw on for the curve derivation.
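
As a conceptual illustration of such a user-defined curve set, each point can carry its own tenor, rate and data source, so a trader can override the source point by point. The field names below are our assumptions:

```java
import java.util.List;

/** Hypothetical model of a trader-built yield curve set. */
public class CurveBuilder {
    record CurvePoint(String tenor, double rate, String source) {}

    public static void main(String[] args) {
        List<CurvePoint> usdCurve = List.of(
            new CurvePoint("3M", 0.0512, "live-feed"),
            new CurvePoint("2Y", 0.0548, "live-feed"),
            new CurvePoint("10Y", 0.0571, "database")); // source overridden here
        usdCurve.forEach(System.out::println);
    }
}
```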

The Beast was originally developed for Tullett & Tokyo's foreign exchange group after a search for a scalable and flexible analytic system capable of integrating numerous market data feeds in real time proved fruitless. The Beast relies on a collection of leading-edge technologies, including a Java-based user interface, a proprietary middleware package and a dynamic data model, to create an on-line trading environment capable of linking traders in multiple locations into a single virtual community. This means that new analytics, deal types and proprietary information (such as a notice about a client shopping around for a particular trade) can be distributed almost instantly to a global trading group.


Although The Beast can be configured as a standalone Windows NT application, it is most impressive when run across a web-enabled network, such as a corporate intranet, in a multiuser environment. Designed around Internet technologies, The Beast comes in two flavors—one written primarily in Visual C++ and one written in Java. Because The Beast incorporates a sophisticated, proprietary middleware package designed to optimize network resources and load balancing among multiple machines, CastleNet claims The Beast can support a nearly unlimited number of simultaneous users simply by adding more Beast application servers to the Beast network.

Components of The Beast include the following:

The TUI. The Beast allows traders to access data and analytics using a mouse or a touch screen; hunt-and-peck typing on a keyboard is unnecessary. This feature of The Beast is known as the trader user interface (TUI), and it can be configured for each individual trader's viewing preferences. For example, traders can create their own pages to view one or more types of market data and calculations.

Proprietary middleware. The Beast's proprietary middleware package allows for both extreme scalability and the rapid and efficient integration and distribution of market data. According to Carl Carrie, president of CastleNet, it is easy for Beast users to add or subtract market data sources at will and for individual traders to customize the menu of market data they receive. The Beast's middleware engine also includes a load-balancing feature that maintains processing speed in case of a disaster such as a server crash by dynamically reallocating tasks to remaining servers.
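
The failover half of that load-balancing claim reduces to a simple idea: when a server drops out, its queued work is redistributed across the survivors. A minimal sketch (CastleNet's actual middleware is proprietary):

```java
import java.util.ArrayDeque;
import java.util.List;
import java.util.Queue;

/** Redistribute a crashed server's queued tasks round-robin to survivors. */
public class Failover {
    public static void main(String[] args) {
        Queue<String> crashed = new ArrayDeque<>(List.of("task-1", "task-2", "task-3"));
        List<Queue<String>> survivors = List.of(new ArrayDeque<>(), new ArrayDeque<>());

        int i = 0;
        while (!crashed.isEmpty())
            survivors.get(i++ % survivors.size()).add(crashed.poll());

        System.out.println("Server A now holds: " + survivors.get(0));
        System.out.println("Server B now holds: " + survivors.get(1));
    }
}
```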

Real-time data model. The Beast's real-time data model is based on a global replication structure: changes to valuation models or yield curve constructions are reflected automatically, as they occur, across the entire Beast network. This enables users, regardless of location, to access and view the most current scenarios and valuations, thus ensuring that all interested parties are on the same page.
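
In miniature, that replication behaves like a publish-and-subscribe loop: a change at one desk is pushed to every registered site at once. The Java below is a stand-in of our own devising:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

/** Toy replication loop: every registered site receives each model update. */
public class Replicator {
    private final List<Consumer<String>> sites = new ArrayList<>();

    public void register(Consumer<String> site) { sites.add(site); }

    /** A change at one desk is pushed to every replica. */
    public void publish(String modelUpdate) {
        sites.forEach(site -> site.accept(modelUpdate));
    }

    public static void main(String[] args) {
        Replicator net = new Replicator();
        net.register(u -> System.out.println("London sees: " + u));
        net.register(u -> System.out.println("Tokyo sees: " + u));
        net.publish("USD curve rebuilt with new 10Y point");
    }
}
```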
