Tuesday, April 29, 2008

The R-Word Chronicles, Vol. *********

All:

There comes a point when a recession just starts to drag. You’ve long since absorbed the shock. You’re tired of being reminded of it all. You dread the flatness of every new day. You’re more than ready to flatline the whole sorry experience and move on.

We’re barely into our current recession, and I hadn’t realized how sick I’d become of it until this little bit of beauty popped into my browser: Was Data Integration a Factor in the Subprime Mortgage Crisis? In it, Tony Fisher of SAS/DataFlux tries, not very successfully, to blame the subprime crisis on lenders’ failure to load up on the magic elixirs of data quality and master data management--which, conveniently, are exactly what he’s selling.

Judge for yourself:

  • “Fisher: It is absolutely true that better data management, data quality, data integration practices could have helped identify where things were headed.”
  • “Fisher: What happens today is that you can go online, fill out a couple of forms, click a few buttons and wait a few seconds, and all of a sudden it will come back and say that you’ve been approved. But today, that same type of scrutiny wasn’t (going on) leading up to the crisis. … one of the many reasons falls squarely on the shoulders of the lenders, because they were getting information from potential mortgage buyers, and they were not scrutinizing that data appropriately, not insuring that the data was correct, and that led to an inflated amount of risk the lenders were undertaking. So was the data there, could the data have been used? Absolutely, but it couldn’t be just the data. You had to put the data and the process practices in place together.”

Hey, wait a moment. That’s not a data quality issue, in the sense that it could be fixed with data profiling, matching, merging, and standardization. Rather, it’s a loan quality issue, exacerbated by lenders’ failure to vet the acquired data against other sources before approving the bad loans. On their part, it represented a knowing, willful disregard for the risks associated with subprime loans, motivated by pure greed. Still, Fisher tries to have it both ways:

  • “Fisher: So, at the end of the day, a lot of the subprime mortgage crisis really did have to do with the lack of risk assessment. And risk assessment is something your data will indeed bring forward. All the information is in the data, but you have to use the data.”

Let’s get real here. My sense is that the subprime lenders did indeed use the (junk) data to justify their (junk) lending decisions with full knowledge of the risks, which they apparently considered acceptable until it was too damn late. No amount of technology will keep you from shooting yourself in the foot if you’re so inclined. No matter how trustworthy your data, if it’s in the hands of untrustworthy decision makers, you’re screwed.

Yes, I cover business intelligence, decision support, data quality, master data management, and governance, risk, and compliance systems for a living. But I’m not a Kool-Aid drinker. I don’t imagine that these technologies automatically produce intelligent decisions for managing risk.

So, at the end of the day, people make stupid decisions, in unison, all throughout the economy, all the while thinking that what they’re doing is acceptable by prevailing standards. And they’re quick to point their fingers elsewhere--such as at other people’s failure to adopt some magical technological cure-all--to explain why the economy is suddenly starting to tank.

Hence, recessions. Hence the fact that we never learn.

Jim