Friday, November 16, 2007

ink-impressed tek-biz bound-bulk-pulp review “Competing on Analytics” by Thomas H. Davenport and Jeanne G. Harris

All:

Winning is a “science” now, or so says the subtitle of this new book. Funny, I thought winning was an art—or, rather, a result to be sought through art, science, dumb luck, karma, magic, good genes, treachery, God’s grace, or what have you.

Regardless, winning is adaptive success. Adaptation through natural/competitive (and/or engineered) selection is what drives evolution, and there is some science (i.e., a systematic, fact-based, collaborative inquiry into basic principles, descriptive and predictive) behind our belief that evolution is how life in all its crazy heterogeneity continues to cultivate God’s green Earth. So I’ll grant them this word/concept in this context.

Actually, let me take this opportunity to spell out my core definition of “science,” and then map it into Davenport/Harris’ discussion of how analytics supports a science-like approach under which humans manage to tighten and hopefully brighten our stewardship over this planetary inheritance.

I actually addressed this matter indirectly on July 12 of this year, in this blog, under the seemingly endless (though only two-month) “Ocean Semantic” thread. Buried in an extremely long, shapeless run-on paragraph near the end of that thread, and couched in the context of a gratuitously erudite observation on Kant’s metaphysics, was my definition of “science”: a “process of progressive societal construction of an interlinking system of empirically verifiable statements through the building and testing of interpretive frameworks via controlled observation.”

The key concept here is “controlled observation,” and, in particular, the notion of appropriate controls on (empirical) observations. Pretty much everybody agrees that the key controls on scientific investigations (in order to “build and test interpretive frameworks,” i.e., construct and confirm hypotheses) should be some combination of analytical, logical, mathematical, statistical, experimental, demonstration/replication, independent verification, peer review, and other methods, procedures, checkpoints, and so forth. Some controls are more appropriate and feasible in some branches of scientific investigation than in others (e.g., you can do controlled, laboratory, experimental verification in organic chemistry more readily than in astrophysics). Such fact-based controls are designed to drive the decision to confirm or reject hypotheses, and to disprove, qualify, or constrain established theories.

Getting now to “Competing on Analytics: The New Science of Winning,” Davenport/Harris define their core concept, “analytics,” as referring to “extensive use of data, statistical and quantitative analysis, explanatory and predictive models, and fact-based management to drive decisions.” Clearly, what they’re describing is essentially an application of scientific practices to practical matters: solving business problems. That’s cool…science done in business suits is just as valid as in lab coats….and maybe more useful where it truly counts: creating sustainable value, generating wealth, and contributing to human happiness in some small way.

The book is an excellent discussion of how enterprises can compete through smart application of statistical analysis, predictive modeling, data/text mining, simulation, business intelligence (BI), corporate performance management (CPM), online analytical processing (OLAP), data warehousing, data cleansing, expert systems, rules engines, interactive visualization, spreadsheets, and other applications and tools that once, in the prehistoric days before I entered the industry in the mid-80s, were often lumped under the heading of “decision support systems” (DSS). It’s no surprise that I received the book as a freebie for attending a recent conference sponsored by SAS Institute, which not only was a pioneering DSS vendor starting in the mid-70s but of course remains a powerhouse in BI, CPM, data mining, statistical analysis, predictive modeling, visualization, and many of the other DSS-ish technologies I just enumerated (thanks, SAS!). The book is chock-full of excellent case studies of companies in many industries that have differentiated themselves, notched impressive ROI, and competed effectively through DSS-ish analytics technologies—and also by cultivating analytics-driven cultures spearheaded by CEOs who got analytics religion.

Analytics, analysis, and analysts truly rule…that’s for sure…I’m an analyst, so of course this resonates…and this book is a very handy set of guidelines for organizations that want to leverage their BI and other analytics investments into sustainable competitive advantage. For purely personal reasons, one of the things I noticed while reading this book is that Davenport/Harris twice give kudos to Peter G.W. Keen, who in the mid-70s, as an academic, helped pioneer/popularize the concept of DSS. The reason I say “personal” is that Peter G.W. Keen, in the mid-80s, as president of the short-lived, MCI-funded, DC-based quasi-analyst firm International Center for Information Technologies, hired James Kobielus as a research associate…an experience that led to, among other things, my still-going stint as a contributing editor/pundit for Network World (though it actually wasn’t my first “analyst” job….that was an internship in the summer of 1979, between my junior and senior years in college, at an urban coalition, New Detroit Inc., as a policy analyst, trying to help that city, near which I grew up, recover and rebuild from its sad decline…but I digress). Closing the loop on Keen: when I first picked up Davenport/Harris’ book (but before opening the cover), I thought to myself: “hmmm…’Competing on Analytics’….somehow, it reminds me of the title of Keen’s ‘Competing in Time’ book, which was published during my ICIT stint….hmmm….”

Anyway, one of many things I like about Davenport/Harris’ book is their nuanced discussion of the proper roles of analytics vs. intuition in business decisions, and of the roles of automated analytic tools vs. human analysts (on the latter….whew, I thought….at least they recognize an ongoing role for the likes of me and my kind….maybe we don’t have to surrender our wetware completely to the gratisphere just yet…John Henry was a model-hammerin’ man…..). My favorite excerpt (pp. 131-132): “A few years ago, we began hearing extravagant tales of software that would eliminate the need for human analysts….While data mining software is a wonderful thing, a smart human still needs to interpret the patterns that are identified, decide which patterns merit validation or subsequent confirmation, and translate new recommendations for action. Other smart humans need to actually take action.”
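To make that excerpt concrete, here is a minimal sketch of my own (not from the book) of the division of labor it describes: the mining software surfaces a pattern, and a smart human still has to decide what the pattern means and what to do about it. It assumes Python with NumPy and scikit-learn available; the “customer” data and the loyalty-promotion scenario are invented purely for illustration.

# Toy illustration (mine, not Davenport/Harris'): the tool finds the pattern,
# the human interprets it and decides on action.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(7)
# Fabricated flyer data: columns are [annual_spend_dollars, trips_per_year]
customers = np.vstack([
    rng.normal(loc=[2000, 4], scale=[300, 1], size=(50, 2)),     # occasional flyers
    rng.normal(loc=[12000, 30], scale=[2000, 5], size=(50, 2)),  # frequent flyers
])

# The software's job: surface a pattern (here, two clusters of flyers).
model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(customers)
print("Cluster centers (spend, trips):")
print(model.cluster_centers_.round(1))

# The humans' job, per the excerpt: interpret the clusters, decide whether
# the pattern merits validation, and translate it into an action (say, a
# targeted loyalty promotion) -- none of which the algorithm does for you.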

Another key take-away for me from this book is that professional analysts—i.e., predictive model builders, who power those analytical engines with structured data, deep domain expertise, and statistical algorithms—can only accomplish so much if the organizations that employ them are captive to bad business models. From page 55: “[One of the things that has] kept [American and United Airlines] from succeeding with their analytical strategies….is that their analytics support an obsolete business model. They pioneered analytics for yield management, but other airlines with lower costs can still offer lower prices (on average, if not for a particular seat). They pioneered analytics for complex optimization of routes with many different airplane types, but competitors such as Southwest save both money and complexity by using only one type of plane. They pioneered loyalty programs and promotions based on data analysis, but their customer service is so indifferent that loyalty to these airlines is difficult for frequent flyers.”

In other words, to extend the airline metaphor, the human analysts are like the navigators in the cockpit. They are totally on top of every data point surfaced through radar, instrumentation, etc. But they are essentially captive to the decisions made by the genius sitting in the pilot’s seat.

Somehow, my mind goes back to the movie “Airplane!”, when, after pilot Peter Graves was felled by food poisoning, flight attendant Julie Hagerty got on the intercom and asked the passengers: “Excuse me, there’s no cause for alarm, but does anybody back there know how to fly an airplane?”

Jim