How technology could have prevented U.S. financial meltdown

In the coming weeks the feds and the surviving financial services institutions will have the daunting task of unraveling all the securitized loans and other instruments that are hiding the toxic investments.

But does the technology exist to do that? And if so, could it have been used to prevent the bad debt from hitting the fan in the first place?

The fact is that despite regulations like Sarbanes-Oxley, current rules mandate little visibility into how loans are originated and how they are broken up, resold, and resold again.

To cite the classic example of how we got into this mess, consumers were given 100-percent-plus variable-rate mortgages without any security. Not only could those mortgages be sold to other banks, but they could be divided into five, ten, or twenty tranches (financialese for slices) and resold to five to ten different organizations, making it difficult to track who was involved and who ended up taking the risk.
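
As a rough illustration of why that trail is so hard to follow, here is a minimal sketch (with entirely hypothetical loan, pool, and buyer names) of a mortgage pool being sliced into tranches and sold off, with each buyer recording only its own purchase:

```python
# Hypothetical example: a mortgage pool is sliced into tranches and each
# slice is resold, so no single record ties a tranche back to its loans.
pool = {"id": "POOL-001", "loans": ["MTG-1001", "MTG-1002", "MTG-1003"]}

# Slice the pool into five tranches and sell each to a different buyer.
tranches = [f"{pool['id']}-T{i}" for i in range(1, 6)]
buyers = ["Bank A", "Bank B", "Fund C", "Insurer D", "Bank E"]
sales = [{"tranche": t, "owner": b} for t, b in zip(tranches, buyers)]

# Each buyer only records its own purchase; reconstructing who ultimately
# holds the risk on MTG-1001 means chasing every sale record separately.
for sale in sales:
    print(sale["owner"], "holds", sale["tranche"])
```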

Theoretically, the financial service providers were clear on the risks of each type of loan and had a way to gauge whether they had enough liquidity — cash and other easily sold assets — available if the riskier loans went south.

But a New York Times report indicates that in fact many financial institutions gamed their analytics to favor positive scenarios over negative ones in order to justify keeping less money in reserve should the risky loans blow up. “A large number of buyers of these kinds of instruments really didn’t care about the value. They just wanted to flip it. A lot of people just didn’t want to know,” says Josh Greenbaum, principal at Enterprise Applications Consulting.

Analytics and CEP tools could have helped

Had these financial services companies and banks established business intelligence metrics tracking the ratio of each kind of debt they were holding to the cash reserves behind it, their analytics systems might have raised alerts earlier in the process, says Michael Corcoran, a product manager at the BI provider Information Builders.
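
A minimal sketch of what such a metric might look like, with purely illustrative figures and a hypothetical risk threshold:

```python
# Illustrative sketch of a BI-style metric: the ratio of risky debt held to
# liquid reserves, with an alert when it drifts past a tolerance the firm
# sets for itself. All figures and the threshold are hypothetical.
risky_debt_held = 4_800_000_000   # e.g. subprime and variable-rate exposure
liquid_reserves = 600_000_000     # cash and other easily sold assets
MAX_RATIO = 6.0                   # illustrative risk-tolerance threshold

ratio = risky_debt_held / liquid_reserves
if ratio > MAX_RATIO:
    print(f"ALERT: risky-debt-to-reserves ratio {ratio:.1f} exceeds {MAX_RATIO}")
else:
    print(f"OK: ratio {ratio:.1f} is within tolerance")
```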

But as anyone in business already knows, consolidating that kind of data to get those answers is more often than not a slow process, one that typically ends up being done manually in an Excel spreadsheet well after the fact.

Jeff Wooton, vice president of product strategy at Aleri, a complex event processing (CEP) company, agrees that most data consolidation takes far too long to give a complete picture. “It relies on overnight data consolidation runs, overnight reports, and manual processes like spreadsheets.”

That’s where technologies such as CEP and operational BI come into play. They analyze huge volumes of transactions (on the order of 100,000 messages per second with millisecond response time) and can set up alerts and even trigger remedial actions in other systems. Tools such as Aleri’s Liquidity Management System already exist to help treasurers at global banks gauge their liquidity position in real time. Wooton says that over the last two weeks there has been considerably more interest in such products than in the past.
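
Aleri’s actual engine is far more sophisticated, but a generic sketch of the CEP pattern described here, evaluating each incoming cash-flow event against a running liquidity position instead of waiting for an overnight batch, might look like this (window size, floor, and figures are hypothetical):

```python
# Generic sketch of the CEP pattern: keep a sliding window of recent
# cash-flow events and raise an alert the moment the net position over that
# window breaches a floor.
from collections import deque

WINDOW = 1000             # number of recent events to keep
FLOOR = -50_000_000       # net-outflow floor that should trigger an alert

events = deque(maxlen=WINDOW)

def on_transaction(amount: int) -> None:
    """Process one cash-flow event; positive = inflow, negative = outflow."""
    events.append(amount)
    window_net = sum(events)  # net flow over the most recent window
    if window_net < FLOOR:
        # In a real deployment this might page a treasurer or trigger a hedge.
        print(f"ALERT: net position {window_net:,} breached floor {FLOOR:,}")

# Simulated feed: a run of outflows pushes the position through the floor.
for amt in (-20_000_000, -15_000_000, -25_000_000, 5_000_000):
    on_transaction(amt)
```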

Wooton cautions that the various kinds of analytics tools available, such as business activity monitoring, decision support software, data integration, and alerts, could have offered a warning but could not have fixed the underlying problem of financial services firms misjudging, and in some cases misrepresenting, the risks of their loans and securities.

But taking analytics, CEP, and data integration to the next level, giving regulators a sense of what all the financial institutions were doing and what the liquidity risks actually were, could have helped, and could prevent a recurrence, Corcoran says. Middleware, he says, could bring the data together and create a “common front end” shared by regulators and the services companies alike.

But that front end needs a common back end, especially around the data that should exist, says consultant Greenbaum. “The data model for doing the analysis doesn’t exist,” he says, so a company selling securities and packaged mortgages doesn’t include the packaging history. “You can’t do the classic drill-down,” Greenbaum says, because no one knows what the relationships are; the metadata hasn’t been preserved, or at least not preserved in a way that is easy to find.
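
If that packaging history were preserved, the drill-down Greenbaum describes could be as simple as walking a lineage record from a security back to its original loans. A minimal sketch, with entirely hypothetical instrument identifiers:

```python
# Hypothetical lineage records: each repackaging step notes its parent
# instrument, so a security can be traced back to the original loan pool.
lineage = {
    "CDO-9": {"parent": "POOL-001-T3", "step": "repackaged into a CDO"},
    "POOL-001-T3": {"parent": "POOL-001", "step": "sliced into a tranche"},
    "POOL-001": {"parent": None, "step": "pooled from MTG-1001..MTG-1003"},
}

def drill_down(instrument: str) -> None:
    """Walk the packaging history from an instrument back to its origin."""
    while instrument is not None:
        record = lineage[instrument]
        print(instrument, "<-", record["step"])
        instrument = record["parent"]

drill_down("CDO-9")
```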

Integration is a double-edged sword

While integration of data and the analytics around financial investments could help prevent future financial meltdowns, it’s also true that the integration of business activities on a global scale helped get us into this crisis in the first place, says Suzanne Duncan, financial markets industry leader for the Institute for Business Value at IBM.

“Firm-to-firm and country-to-country integration is increasing. This improves efficiency because it lets capital flow to where it is needed, but at the same time these linkages cause larger shocks at a greater frequency,” Duncan says.

The reason comes back to the lack of visibility into financial risks and liquidity, both at the individual financial institution and globally. Because the information is scattered throughout the firm and held in silos of information systems, many financial institutions did not even have a grasp of the loans they held, she says.

“Many of these bigger firms don’t know what their counterparty exposures are. They don’t know how much Lehman Brothers owes them,” Wooton adds. (A counterparty is any organization with which your company has some kind of relationship, be it a partner or a client.) Duncan agrees: “No matter which part of the ecosystem you are talking about, companies don’t know what their counterparty exposures are.” IBM is in the midst of working with its Asian clients to work out what their exposures are.
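
The roll-up those firms lacked is conceptually simple once positions are pulled out of their silos: group every open position by the organization on the other side. A minimal sketch with hypothetical figures:

```python
# Hypothetical positions from different desks, rolled up by counterparty so
# that "how much does Lehman Brothers owe us?" becomes a single lookup.
from collections import defaultdict

positions = [
    {"counterparty": "Lehman Brothers", "desk": "repo",        "exposure": 120_000_000},
    {"counterparty": "Lehman Brothers", "desk": "derivatives", "exposure": 45_000_000},
    {"counterparty": "Bank A",          "desk": "repo",        "exposure": 80_000_000},
]

exposure_by_counterparty: dict[str, int] = defaultdict(int)
for p in positions:
    exposure_by_counterparty[p["counterparty"]] += p["exposure"]

for name, total in exposure_by_counterparty.items():
    print(f"{name}: {total:,}")
```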

Retail algorithms may be reused for financial services

Although much of the technology that could have tracked the risks and at least forewarned companies of the dangers ahead already exists, IBM is also looking at retooling some current technology that uses sophisticated algorithms to map processes, in order to increase visibility into the risks of financial instruments that are dispersed globally.

Up until now, technology in the financial services industry has been focused on capacity — whether an application can handle high volume and volatility — rather than on process. There is no process flow map that tells organizations who owns what pieces of what risk.

The retail industry, however, already has such flow-mapping technologies. They can track, for example, that consumer A buys a car and two years later sells it to consumer B, who in turn sells his car to consumer C, while consumer A buys a new car; the software maps all of those processes for the sake of tracking buyer behavior. By scanning the Web and physical public documents, a retailer puts together a point of view on a customer’s buying behavior. “That kind of process mapping hasn’t been unleashed to track the whole world of institutional behavior,” Duncan says. But perhaps it could.
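
At its core, that kind of flow mapping amounts to recording each transfer and replaying the chain of ownership for a given asset. A rough sketch, with hypothetical records, of applying it to the retail example above:

```python
# Hypothetical transfer records (in chronological order): replaying them
# rebuilds the chain of ownership for any asset being tracked.
transfers = [
    ("car-1", "Consumer A", "Consumer B"),   # A sells the car to B
    ("car-1", "Consumer B", "Consumer C"),   # B sells it on to C
    ("car-2", "Dealer",     "Consumer A"),   # A buys a new car
]

def ownership_chain(asset: str) -> list[str]:
    """Rebuild the ordered list of owners for one asset."""
    chain: list[str] = []
    for item, seller, buyer in transfers:
        if item == asset:
            if not chain:
                chain.append(seller)
            chain.append(buyer)
    return chain

print(ownership_chain("car-1"))   # ['Consumer A', 'Consumer B', 'Consumer C']
```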

Of course, even with the right technology in place, Greenbaum cautions, it will do no good unless meaningful government oversight ensures that financial services companies cannot, through ignorance or deceit, create and distribute toxic assets so wildly again.
