Friday, May 30, 2008

Relations with Analysts...the second

All:

Second Q posed to the Forrester AR Council panel on how to relate to the blogosphere, followed by my A:
  • Q: Does the manner in which the AR professional deals with blogging change with the size of the organization, e.g., is it harder for AR at larger firms to anticipate and address the myriad of issues coming at them from blogging pundits? Are smaller, more agile firms at an advantage?
  • A: Hard to say. If you’re a bigger, more diversified, more dynamic vendor, you’re likely to elicit more commentary from more external parties through more channels on more issues more of the time. But you’re also likely to have more of your own people reading--and anticipating--all of this, and preparing/spinning suitable responses. But you’re also more likely, if you’re big, to have more trouble coordinating internally among all stakeholders in order to prepare a concerted response. But, conversely, if you empower more of your people to post replies/counter-attacks through their own blogs, or through your company’s blogs, you can defuse the issues more rapidly. Or, if you’re not careful, light more fuses. And give the appearance that your right hand doesn’t know what your left hand is doing. Gee, I wish there were easy answers.
Jim

Thursday, May 29, 2008

Relations with Analysts...the first

All:

Whew...quite a string of travels...not through it all yet. In the past month, I’ve been to TIBCO’s TUCON (San Francisco) and SAP’s SAPPHIRE (Orlando), plus a quick IT vendor consult...next week, I do Informatica (Vegas), then the following week Microsoft (Orlando).

Last week, I was at Forrester's IT Forum 2008 in Vegas, where, among other things, I participated in a panel session on blogging, focusing on how analyst relations (AR) professionals should relate to “influencers” in the blogosphere.

Organized by Forrester’s Analyst Relations Council and moderated by Forrester VP Laura Ramos, the panel brought together leading IT industry analyst/bloggers plus those who blog-about-analyst/bloggers: Carter Lusher, president of Sage Circle; Dana Gardner, principal analyst, Interarbor Solutions; Bill Hopkins, founder & CEO, Knowledge Capital Group; and Jonathan Eunice, founder and principal consultant, Illuminata. Oh, and a “token” Forrester analyst who’s been kicking around the blogosphere for a few years, including, increasingly, under our information and knowledge management blog (in case you’re wondering why the rate of postings to my personal blog has dropped in the past few months--still searching for the right rhythm and balance and partitioning of the jim-o-spheres, left and right, between the two).

Last week’s Forrester AR Council panel was well-attended, and the questions from council members were excellent. My fellow panelists were everything we could have hoped: smart, informed, opinionated, articulate, provocative. I’ll leave it up to them, in their respective blogs, to repeat what they put forth.

Here, now, is the first question that was posed to us, plus, in general terms, how Kobielus responded:
  • Q: How do AR professionals stay on top of bloggers and determine who to interact with and “influence” and who to ignore?
  • A: Simply ask yourself who you read, who your colleagues read, who your clients read--whose pieces you/all forward--whose you/all link to--whose ideas stick in your minds--whose names, reputations, and methodologies resonate with everybody in your immediate work environment and/or industry. Those are the indicators of “influence.” To the extent that an analyst exerts such influence purely through one channel--blogging--all power to them. But the best analysts have always availed themselves of all channels at their disposal to inject their ideas into the bloodstream of the industry. Chances are that the chief “bloggers” are established analysts who have simply reinforced their brand through this medium. If they’ve made blogging the core of their for-pay business model, cool (and please explain how). Most of us analysts use blogging in various and sundry funky ways to supplement/promote our for-pay gigs.
By the way, this goes without saying (or does it?): this is not the official Forrester position on all of this (we're working through these issues, just as every other analyst firm is, in the context of our own evolving business model). As I mentioned, I was simply one Forrester analyst asked to share his thoughts.

Jim

Saturday, May 03, 2008

BI craves cheap horsepower

All:

Analytic databases are the principal engines driving business intelligence (BI), delivering operational data into reports, dashboards, and ad-hoc queries.

Essential as they may be, analytic databases have been largely overlooked in the BI industry’s recent consolidation spree. Sitting at the core of data warehouses (DWs) everywhere, these data stores have been treated as mere plumbing rather than as differentiating platform components. Instead, most recent BI mergers have been driven by vendors’ desire to beef up their financial analytic applications, or add more sophisticated visualization, search, and other access-oriented features to their BI platforms.

Though often taken for granted, analytic databases will almost certainly become a key BI solution differentiator over the next several years. With the trend toward commoditization of core BI features, more vendors will distinguish their offerings through the speed, scalability, throughput, and mixed-workload support that only a well-tuned analytic database can provide. Every self-respecting BI vendor will boast that their analytic database can handle more concurrent users, process more complex multidimensional queries, load bulk data more rapidly, execute more compute-intensive transforms, and manage more massive data sets than the competition. Just as important, they’ll brag that they can do all this more cheaply than the next guy.

In an increasingly commoditized BI market, analytic price-performance is becoming the principal buying criterion. This trend is fueling the industry’s growing focus on analytic appliances, which are also called BI appliances or data warehousing (DW) appliances. Indeed, most of the leading BI vendors--SAP/Business Objects, IBM/Cognos, Oracle, Microsoft, and SAS Institute--provide their own analytic appliances now, or are developing appliance-based offerings on their own or with partners. Though these vendors will continue to deliver BI/DW solutions as packaged software offerings, they all see the appeal of appliances as turnkey solutions for many customer requirements. Midmarket customers, in particular, are taking a keen interest in appliances, which provide them with quick-deployment pre-optimized solutions and thereby relieve the burden on their limited technical staffs.

As analytic appliances become central to enterprises’ BI strategies, DW appliances will evolve into full-fledged BI platforms in their own right. Appliance vendors such as Teradata, HP, Netezza, Greenplum, DATAllegro, Dataupia, and ParAccel will expand their ability to run “in-database analytics” and other applications developed in-house, or by partners and customers. Appliance vendors will outdo each other in tuning database features--such as indexing, partitioning, in-memory caching, compression, cubing, tokenization, and query-plan optimization--that are geared for managing myriad analytic workloads. And every appliance vendor will beef up their hardware’s scalability through massively parallel processing, clustering, workload management, and other ongoing enhancements.
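The divide-the-work pattern behind that MPP scalability can be sketched in a few lines. This is a hypothetical toy, not any vendor’s engine: hash-partition the fact data across workers, aggregate locally on each partition, then merge the partial results (a single process stands in for the cluster here).

```python
# Toy sketch of the MPP/shared-nothing aggregation pattern: partition,
# aggregate locally, merge. All names here are illustrative.
from collections import Counter

def hash_partition(records, n_workers):
    """Assign each (key, value) record to a worker by hashing its key."""
    partitions = [[] for _ in range(n_workers)]
    for key, value in records:
        partitions[hash(key) % n_workers].append((key, value))
    return partitions

def local_aggregate(partition):
    """Per-worker partial aggregation: sum of values grouped by key."""
    totals = Counter()
    for key, value in partition:
        totals[key] += value
    return totals

def parallel_group_sum(records, n_workers=4):
    """Merge the per-worker partial sums into one global result."""
    partials = [local_aggregate(p) for p in hash_partition(records, n_workers)]
    merged = Counter()
    for partial in partials:
        merged.update(partial)  # Counter.update adds counts, so partials combine
    return dict(merged)

sales = [("east", 100), ("west", 50), ("east", 25), ("north", 10)]
# parallel_group_sum(sales) sums amounts per region across all partitions
```

Because each record lands on exactly one worker, every partial sum can proceed without coordination; the only cross-worker step is the final merge. That independence is what lets appliance vendors scale out by simply adding nodes.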

In addition, every vendor of column-oriented databases--which are exquisitely well-suited to data-intensive query processing--will soon either realign its go-to-market strategy around appliances or get out of the analytics market altogether. The performance advantages of a hardware-optimized column-oriented database over software-only rivals will be too pronounced for the latter to hold onto their market share. And though most appliance vendors currently eschew column-oriented approaches, preferring to tweak traditional row-oriented RDBMSs for multidimensional online analytical processing (OLAP), many will explore this alternative technique in order to eke out further performance improvements.
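To see why column orientation suits data-intensive query processing, here’s an illustrative sketch (not any vendor’s actual storage format): the same data laid out row-wise and column-wise, with an aggregate that only needs one attribute.

```python
# Row store: all fields of each record are interleaved, so an aggregate
# over one column still visits every record in full.
rows = [
    {"order_id": 1, "region": "east", "amount": 120.0},
    {"order_id": 2, "region": "west", "amount": 75.5},
    {"order_id": 3, "region": "east", "amount": 200.0},
]

def total_row_store(rows):
    """Scan every record; materialize every field just to read one."""
    return sum(r["amount"] for r in rows)

# Column store: one contiguous sequence per attribute.
columns = {
    "order_id": [1, 2, 3],
    "region": ["east", "west", "east"],
    "amount": [120.0, 75.5, 200.0],
}

def total_column_store(columns):
    """Scan only the 'amount' column; the others never enter the loop
    (in a real engine, never leave disk)."""
    return sum(columns["amount"])
```

Both functions return the same total; the difference is how much data each must touch to get there, which is precisely the I/O advantage column stores exploit on wide fact tables.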

The growing demand for cheap analytic horsepower will also foster the development of subscription-based DW services, also known as “DW 2.0,” “Database 2.0,” “cloud databases,” and “on-demand databases.” Though not the first entrant in this new arena, Microsoft is the most prominent, having recently rolled out a limited beta of its hosted SQL Server Data Services (SSDS), which is slated for full production release in 2009. Under SSDS, Microsoft hosts a subset of SQL Server’s relational database management system (RDBMS) functionality in support of analytics as well as transactional applications. Though it has not yet specifically optimized SSDS for analytics, Microsoft has stated that it plans to evolve the service in that direction.

As it becomes available from many service providers, DW 2.0 will offer an ever-expanding supply of cheap, plentiful analytic horsepower. Over the coming decade, software-as-a-service (SaaS) providers will begin to offer feature-complete, subscription-based BI/DW services for high-performance, high-volume, complex analytics. These clouds will leverage the full virtualized, distributed, scalable, grid-computing fabric that Microsoft, Google, and other SaaS behemoths can bring to bear on data mining, performance optimization, and other compute- and data-intensive tasks.

Over time, we’ll come to take DW 2.0 for granted. We’ll call it up on demand, a utility for processing any and all decision-support tasks, large or small, throughout the business world or in our daily lives.

Jim

Wednesday, April 30, 2008

FOA v3 of 3

All:

SOA’s strength is in its inner abstraction, its paradigmatic focus on the goal of maximizing sharing, reuse, and interoperability of key corporate resources over networks, thanks to open standards.

SOA’s goals are laudable, but now we have the presentation, access, delivery, and socialization layers to consider. Socialization layer? You mean social networking? You mean wikis, collaborative bookmarking, and all of that Web 2.0 stuff that keeps on innovating so fast that no clear design patterns, hence no stable interoperability specs, can emerge?

How can we define standards to support sharing, reuse, and interoperability in an emerging network computing fabric that steadfastly refuses to settle down and decide what it wants to do when it grows up?

In which friends organize architectures, and failures only accelerate the push toward some simpler, less abstract, more practical architecture that totally works.

Whether or not we analysts have conceptualized it all in every fine detail in advance.

Jim

FOA v2 of 3

All:

SOA’s failure isn’t so much a fault of the vision, as it is a reluctance to recognize that any particular middleware implementation will soon be obsoleted by something much grander, and fuzzier.

Perfect example: SOA--which many of us have predicated on the notion of universal adherence to abstract interfaces that leverage XML and WS-* specifications--is gradually being abstracted in a broader paradigm that some have called Web-Oriented Architecture (WOA), which is essentially Web 2.0 plus Representational State Transfer (REST), and for which many of the most important high-level patterns, such as social networking, have no clear reference frameworks. WOA nelly! What’s to become of the messy but reasonably coherent SOA stack in a world where there’s no clear commitment to standardizing on interoperability specs that go much beyond bare-bones HTTP, HTML, and JavaScript?

Besides, WOA is primarily a phenomenon of the presentation, access, delivery, and socialization layers, a domain that SOA never seriously attempted to penetrate. WOA is exposing the inherent limitations of the SOA vision, limitations that have been there from the start.

If you consistently acknowledge the limits of your vision, feasibility and flexibility considerations will inevitably open your architecture.

Jim

FOA v1 of 3

All:

SOA’s f*cked, some say, but I don’t hold with the naysayers.

Anne Thomas Manes of Burton Group has spread this notion that SOA’s a failure because few enterprises have come anywhere close to the nirvana of 100 percent service reuse. But my feeling is that that’s like saying democracy’s a failure because we’ve never come close to 100 percent voter turnout in any specific election.

If you like, you can posit a utopian ideal and then declare the world a failure because it hasn’t signed up to your vision. Or you can declare your vision a vision and then commit yourself to pushing the world in that direction little by little for the rest of your life. The open issue is: Do you see yourself living this vision till the end of your days? Can you live with the possibility that your vision, however much it gives your life purpose, is regarded by others as an impractical pet cause that sets you apart from the pack, and not in a good way?

If you’re prepared to push that boulder up a hill, if you’re so bold that you can’t rest till your vision is in everybody’s sights, then you’re a true visionary. And you might stand a chance of succeeding against all odds. Then you’re the Al Gore of SOA, weathering disdain with full knowledge that the facts bear you out and history will judge you kindly.

If a bare majority consistently sides with your vision, fate will ordain your architecture.

Jim

Tuesday, April 29, 2008

The R-Word Chronicles, Vol. *********

All:

There comes a point when a recession just starts to drag. You’ve long since absorbed the shock. You’re tired of being reminded of it all. You dread the flatness of every new day. You’re more than ready to flatline the whole sorry experience and move on.

We’re barely into our current recession and I didn't realize how sick I've become of it till this little bit of beauty popped into my browser: Was Data Integration a Factor in the Subprime Mortgage Crisis? In it, Tony Fisher of SAS/DataFlux tries, not very successfully, to blame the subprime crisis on lenders’ failure to load up on the magic elixirs of data quality and master data management--which, conveniently, are exactly what he’s selling.

Judge for yourself:

  • “Fisher: It is absolutely true that better data management, data quality, data integration practices could have help identify where things were headed.”
  • “Fisher: What happens today is that you can go online, fill out a couple of forms, click a few buttons and wait a few seconds, and all of a sudden it will come back and say that you’ve been approved. But today, that same type of scrutiny wasn’t (going on) leading up to the crisis. … one of the many reasons falls squarely on the shoulders of the lenders, because they were getting information from potential mortgage buyers, and they were not scrutinizing that data appropriately, not insuring that the data was correct, and that led to an inflated amount of risk the lenders were undertaking. So was the data there, could the data have been used? Absolutely, but it couldn’t be just the data. You had to put the data and the process practices in place together.”

Hey, wait a moment. That’s not a data quality issue, in the sense that it would be possible to fix it with data profiling, matching, merging, and standardization. Rather, it’s a loan quality issue, exacerbated by lenders’ failure to vet the acquired data against other sources before approving the bad loans. On the lenders’ part, it represented a knowing, willful disregard for the risks associated with subprime loans, a disregard motivated by pure greed. Well, Fisher tries to have it both ways:

  • “Fisher: So, at the end of the day, a lot of the subprime mortgage crisis really did have to do with the lack of risk assessment. And risk assessment is something your data will indeed bring forward. All the information is in the data, but you have to use the data.”

Let’s get real here. My sense is the subprime lenders did indeed use the (junk) data to justify their (junk) lending decisions with full knowledge of the risks, which they, apparently, considered acceptable until it was too damn late. No amount of technology will keep you from shooting yourself in the foot if you’re so inclined. No matter how trustworthy your data, if it’s in the hands of untrustworthy decision makers, you're screwed.

Yes, I cover business intelligence, decision support, data quality, master data management, and governance, risk, and compliance (GRC) systems for a living. But I'm not a kool-aid drinker. I don’t imagine that these technologies automatically produce intelligent decisions for managing risk.

So, at the end of the day, people make stupid decisions, in unison, all throughout the economy, all the while thinking that what they’re doing is acceptable by prevailing standards. And they're quick to point their fingers elsewhere--such as at other people's failure to use some magical technological cure-all--to explain why the economy is suddenly starting to tank.

Hence, recessions. Hence the fact that we never learn.

Jim

Friday, April 04, 2008

The R-Word Chronicles, Vol. 8

All:

Ah, yes, the head honcho utters the taboo word in an official public statement, and so it now may be discussed in polite company. But with rhetorical tongs, such as air quotes, or excessively noncommittal “let’s leave this to the experts” hedging and hawing, or a rushed delivery strongly hinting at “let’s discuss the weather instead.”

What precisely is Papa’s delicate condition? “Federal Reserve Chairman Ben Bernanke said for the first time that the U.S. could slip into recession this year, using a word that other government officials including President Bush have gone to great lengths to avoid. ‘A recession is possible,’ Mr. Bernanke told a congressional committee Wednesday, citing turmoil in the housing and credit markets. He added, ‘We're slightly growing at the moment, but we think that there's a chance that for the first half as a whole there might be a slight contraction.’ His comments risk adding to economic gloom. But they also won praise from some economists as [blah blah blah] ...”

Speaking of hedging and hawing, how about that Bernanke with his “a recession is possible,” coupled with “we’re slightly growing,” and “we think that there’s a chance...for a slight contraction”? For an example of the same at a micro-economic level, in my bread-and-butter area of business intelligence (BI), check out Mike Schiff’s recent article “The Impact of a Recession on the BI Market.” It’s a good article, and what he says re the potential for BI to help enterprises weather a recession is pretty much the same as I argued in the Network World article that is also Vol. 1 of this thread. But, seriously, come on Mike, you have watered down the current economic situation into an exquisitely flavorless broth: “There has been much speculation about the effects of a possible recession on the business intelligence market and how this could adversely affect customer spending and associated vendor revenues. While caution is certainly advised, I suspect that these concerns are much too pessimistic and that a general business slowdown may have little detrimental effect on the BI industry. In fact, it might even result in increased BI spending.”

Getting back to the macroeconomic situation, it’s funny that the ideological pendulum has swung in the direction of tighter regulation to keep the US economy from drifting further into the doldrums. Even a conservative Republican administration is now advocating a crack-the-whip anti-laissez-faire approach to re-regulating the securities industry in the wake of the subprime mess (contrast this with Republican Herbert Hoover’s snoozing response to the stock-market crash of 1929, or Republican Ronald Reagan’s “get government off our backs” response to the persistent stagflation of the 70s/80s).

Even those recent regulatory initiatives that had started to fall out of favor are finding new life under stormy skies. “Even at the U.S. Chamber of Commerce's annual conference on capital markets this week, there was a marked shift in tone from last year, when Sarbanes-Oxley was blamed for making U.S. markets less attractive to overseas investors. ‘An increasing appreciation for the internal controls is emerging,’ Jim Turley, chief executive of accounting firm Ernst & Young, said at the Chamber conference, where many of the pro-business speakers said there may be a need for more regulation.”

Ah, yes, the new-old R-word: regulation.

That’s still taboo, isn’t it?

Jim

Wednesday, April 02, 2008

The R-Word Chronicles, Vol. 7

All:

Mostly, a recession is a spiritual slough that threatens to drag us down, a long overcast period that seems to drag on and on. Everybody develops their own strategy for coping with woes of indeterminate duration.

Pockets of hope--some of them fleeting--exist in every slow period. For starters, everybody knows that the clouds will clear reasonably soon--in a few quarters, or at the most a few years. Generally, we fancy ourselves with the notion that this rough patch won’t be all that deep, or all that long. Hence, recent headlines such as U.S. Recession will End in Q4, which take hope from the Bush administration’s and the Fed’s recent actions to stimulate the economy and ramp up regulation over a broader swath of the securities market in the wake of the subprime mortgage mess. But, quite frankly, I find myself unnerved by overly optimistic “government will come to our rescue” statements. Such as “these interventions are bound to have a positive impact on the economy.” And “these measures along with the renewed optimism about an anticipated change in presidential administration will pull the economy out the recession by Q4.” I hope these prophecies come true, but I have a degree in economics and know full well that government can just as easily mis-time its measures or otherwise destabilize an already shaky system.

Some draw hope from the globalization of the world economy, with steady demand growth in emerging markets such as India, China and Brazil offsetting any slack in the US and other established economic powers. Also, outsourcing, especially of IT-related jobs, will likely remain a growth industry, both offshore and domestically. Hence, the recent Computerworld article, Recession unlikely to curb H-1B demand, which notes that “[US] demand for foreign workers, including skilled software developers and other IT professionals, [is] still rising as economic conditions grow steadily worse.” It also notes that offshore IT outsourcing firms are going onshore to a degree to get around US visa limits, even in the face of a recession. “Last July, Wipro Ltd. announced plans to build a 1,000-worker software development center in Atlanta. And Tata Consultancy Services Ltd. said this month that it is opening a services delivery center near Cincinnati. Tata, which has more than 16,000 employees scattered at client sites in the U.S., plans to mostly hire locals -- initially, about 500 people -- at the new facility.”

And, of course, many take comfort in their ability to leverage diverse skills in order to stay employable in tough times--especially if they’re an old codger like me who has been in the workforce for decades. Hence, a recent article in Computerworld, “Opinion: Building a recession-resistant career.” This opinion piece, written by an executive in an IT placement firm, dishes out the sort of all-purpose advice that applies in good times and bad, and at all stages in your career. “Those who continually update their skills and build their networks keep themselves in high demand when employers start tightening their belts.” And then it provides a list of IT skills currently in demand, most of them in app development and system admin. The piece also stresses “soft skills” such as problem-solving, business acumen and interpersonal communication to differentiate the most promising job candidates from the sullen, tangle-tongued uber-geeks of this world.

All fine and good, but nothing you’re not already doing out of sheer habit...I presume. Which leads me to one more point to close out this post. Since we’re dealing in the blindingly obvious, enduring recession--or any other streak of stormy weather--with your spirit intact is as simple as getting out of bed the same time every morning, looking at yourself in the same mirror, and telling yourself, a la an Al Franken character, “I’m good enough, I’m smart enough, and, gosh darn it, I’m employable.”

Or, hopefully, already employed. And hanging in there. Going out for coffee.

Jim

Friday, March 21, 2008

The R-Word Chronicles, Vol. 6

All:

One way that recessions stir anxiety is by making people worry whether they’ll be rendered redundant. One way this can happen is when our customers find that they can do what we do reasonably well all by themselves.

Even IT industry analysts worry about such things, and not just individually as human beings, but, collectively, structurally, as an industry. We wonder whether the Internet and all that it has wrought will empower customers to the point where they’ll be able to spot the competitive “disruptors” as well as we do. I addressed that issue today in my post to Forrester’s I&KM blog. One thing that Forrester doesn’t shy away from is examining issues such as this that hit close to home.

Speaking on behalf of all IT industry analysts everywhere, we like to think that our research skills, brainpower, and insights add value that we can take to the bank.

That we are, collectively, indispensable (some of us individually more than others, maybe).

That you, the user and vendor communities, are all the richer for engaging our services.

What do you think?

Jim

Thursday, March 20, 2008

The R-Word Chronicles, Vol. 5

All:

Free is appealing in all economic climates, but especially in a cloudy one. So it’s predictable that free-ish IT options--no/low cost, no/low risk, no/low commitment--will be given greater attention in times like these. Open source, of course, but also SaaS (aka “cloud computing”).

Check out my latest Forrester blog post on the growing SaaS phenomenon called “cloud databases” (aka Database 2.0). I take a special look at Microsoft’s new SQL Server Data Services offering. I also discuss Panorama’s new gratis, lightweight, BI/OLAP-engine add-on to Google’s hosted apps, and speculate on the possibility of Google leveraging its GoogleBase “cloud database” into a full-blown Data Warehousing service in the cloud. See Boris Evelson’s post, “Free BI!,” for further details on the Panorama/Google announcement.

None of these Database 2.0 initiatives--Microsoft’s, Google’s, or those from startups such as Trackvia, DabbleDB, and Zoho--is ready for enterprise primetime OLTP and OLAP applications. As I note in the Forrester post, Microsoft’s initiative is just a beta, and only provides a subset of the functionality of SQL Server, whereas GoogleBase is simply a depository for data that companies would like crawled/indexed by Google’s search engine.

The cloud database space is just beginning to form larger droplets that may some day irrigate the planet below. Enterprises in a budget crunch may choose to use these database services on a pay-as-you-go basis, though they’re far from a no-cost, no-risk proposition. It’s not clear whether any of the Database 2.0 startups will survive, nor whether Microsoft and Google are prepared to go further with their initiatives. Also, Oracle, IBM, and other database vendors have not indicated whether they plan to offer similar services.

The smart money says they will, but, once again, that’s not a sure bet. Users all over creation are feeling far too much risk right now. There are far too many bleeding economic indicators in everybody’s dashboards.

Jim

Thursday, March 13, 2008

The R-Word Chronicles, Vol. 4

All:

A quick shout-out back to the always-worth-reading Mr. Tony Baer of OnStrategies, who has just blogged in response to my most recent Network World column, incorporated into Vol. 1 of this thread, on BI as a tool for weathering recessions.

OK, I’ve had a bit too much caffeine and I hyperlinked that last sentence to kingdom come. But there really is a point to this shout-back, above and beyond me finally beginning to respond directly to my peers across the blogosphere.

With all due respect, Tony, I didn’t argue, per your interpretation, that “BI provides tools that are not likely to be sacrificed in a downturn.” Don’t think I’m hair-splitting on this. Please hang with me a sec while I clarify.

As you noted, one assumption I made in writing that article was that there are no sacred cows in a business downturn. But I also assumed that every investment--including BI--is a likely candidate for chopping.

In that article, I issued a challenge to IT professionals to safeguard their companies’ investments in BI, data warehousing, and related assets in the face of fiscal austerity. Here’s the core thesis of the piece:
  • “For IT professionals, the greatest test may be in how well they safeguard their organizations' core analytics assets from budget cuts during down economic times. The core issue is: How can you optimize your end-to-end analytic environment -- such as, control costs of your business intelligence applications, predictive analytics applications, enterprise data warehouses and other key infrastructure -- without impairing service levels or limiting your company's ability to leverage analytics into new business opportunities?”
By the way, yes, I do cover BI for Forrester (as I noted in an earlier post: Boris Evelson is our lead analyst on BI; I’m lead on data warehousing, plus partner-in-crime with Boris in what amounts to a Forrester dynamic duo on all things BI, a space so broad and deep that one analyst alone can barely do it justice). But, be that as it may, I’m under no illusion that this technology’s value proposition is so cinch-tight that corporate investments in BI can’t get swept out to sea in a perfect business storm.

While I’m on the topic of my Forrester coverage areas, check out our Information and Knowledge Management Blog. I post to the Forrester I&KM blog too, of course, along with all of my colleagues. FYI, in terms of upcoming BI research we’ll be publishing in coming months, Boris is taking the lead on our latest BI Wave, and I’m lead on our forthcoming BI market sizing study.

Back now to the topic of the r-word and its impact on I&KM. Check out Kyle McNabb’s recent post on the impact of recession on enterprise adoption of enterprise content management (ECM) solutions.

If you’re a Forrester customer, you should also look at Andrew Bartel and Merv Adrian’s recent teleconferences, “Vendor Market Strategists: How To Prepare For A Downturn” and “CIOs: How To Prepare For A Recession.” Also, Connie Moore did a teleconference entitled, “Information & Knowledge Management Professionals: How To Survive (And Even Thrive) In Times Of Economic Uncertainty.”

Another bread-and-butter plug: Role-focused competitive, market, and best-practices intelligence are always essential, in all economic climates. That’s where industry analysts fit into the IT world’s food chain. Deep feeds of intelligence from leading industry analysts--that should be sacrosanct, if nothing else is, in all enterprise and vendor budgets.

Got that? Role--that’s an R-word. Research--another one. Relevance--a third. This could go on forever but won’t.

Thanks for your indulgence.

Jim

The R-Word Chronicles, Vol. 3

All:

For those of us who’ve made our living in IT, and who are now almost entirely ensconced in a post-millennial frame, one recent recession looms largest. That recession, the 2000-2002 unpleasantness (aka the bursting of the “Dotcom Bubble,” which I like to think of as the Millennium Recession), was, for some of us, transformative.

Just as I personally experienced the Great Recession of the 1970s from a regional perspective--that of a long-declining Midwestern manufacturing city--many slogged through the most recent US economic downturn in a puddle of pure metropolitan pain. Of course I’m talking about Silicon Valley. Though I’ve personally been residing in the National Capital Region for the better part of a quarter-century, I felt the pain of those years vicariously through the sufferings of my Bay Area professional friends, associates, and virtual acquaintances who suddenly saw their jobs, careers, companies, business models, finances, and lives crash--or, at the least, veer wildly off course.

Though most of my Bay Area connections recovered reasonably well as the decade wore on, the memory of dashed dreams still stings. No metro area felt the recessionary whomp harder than the fine folks of that scenic, seismic peninsula. For the IT world generally--and for Silicon Valley as the epicenter of the Internet-driven economic revolution of the 1990s--that nasty bit of business was the rump-end of what economist Joseph Schumpeter called capitalism’s “creative destruction.” It was akin to the sort of creative destruction that global competition has been visiting on Detroit for the better part of my life.

Essentially, the year 2000 turned out to be a sort of temporal fulcrum, counterbalancing the wildly creative bubble years of the extremely late second millennium with a resounding recessionary crunch. Was this a Y2K bug of another sort altogether? Well, it didn’t truly crash the new Internet-centric world economic order, nor did it destroy any of the most fundamental innovations that surfaced in the 1990s. Rather, the Millennium Recession simply introduced further entropy into an already chaotic but very vital system. It atomized and scattered many ambitious ventures, business models, technologies, teams, brands, innovators, etc., hither and yon. It spurred them to recombine into more self-sustaining business concerns. Hey there, Schumpeterians, if we’re looking for another euphemism, call this messy process “constructive deconstruction.”

I take little comfort in the prevailing perception that the Washington DC area is somehow “recession-proof.” It is not. For starters, it isn’t (federal) budget-proof. The federal budget isn’t politician-proof. Politicians aren’t election-proof. And elections aren’t recession-proof. Neither are tax revenues, or federal budget appropriations, or IT-centric projects that depend on those appropriations, or the IT professionals who people those projects and who happen to reside in and around this particular mid-Atlantic estuarial basin. It’s the economy, stupid (remember: a candidate got elected president in the midst of the early 90s recession on the strength of that slogan).

But the Washington area--in particular, northern Virginia’s high-tech sector--is doing OK. I should note that we of the DC-area high-tech community have become much more diversified in the past quarter-century, in terms of whence our bread is buttered. Case in point: I’ve been in IT since the mid-80s, all of it in the DC area, and have drawn income from federal work (i.e., contracts) in only a half-dozen or so of those years.

Consequently, I’m semi-abstracted, economically, from my current metro area on a day-to-day basis. As an IT industry analyst, my personal geographic siting is semi-irrelevant. My schedule is torn between the demands of people in many time zones. I’m delivering value to customers in many countries. I’m continuously feeling the mixed pain-pleasure equation of every cresting or crashing solution area in every IT market that I cover, wherever those solution providers hail from.

And though my employer at any point in time is a business with a specific HQ somewhere on the globe, each of these firms could just as easily be sited elsewhere and carry on much the same. Many analysts in many firms work remotely and virtually, rarely needing to come to HQ for any operational reason. Over the course of my analyst career, I’ve worked remotely for firms HQ’d in the Salt Lake City area (hands down--the best view from HQ windows in my career--lovely craggy snowy mountains), the DC area (not far from my home/office, as the commute-weary crow might fly--HQ view of a shopping center parking lot), and, now, the Boston area (right next to an awesome world-class science/technology/engineering university--HQ view of semi-nondescript dense-packed office buildings nearby). The local impact of a recession on those particular metro areas concerns me about as much as the jobless rates in, say, Detroit or Jakarta.

Interestingly, I saw an article this morning about Silicon Valley’s primary high-tech bellwether Google, reporting that the company is attempting to diversify by pursuing more federal business. Cool, Google, though beware that the federal budget’s due for a significant dip under the next president, for three main reasons.

For one thing, whichever of the three remaining candidates becomes our next commander-in-chief, they’ll almost certainly pull back our ground forces from Iraq and Afghanistan, and cut the defense budget accordingly.

Second, the current economic expansion is already around six years old--the same age as the Reagan-presided (and defense-fueled) boom of the 80s and the Clinton-presided (and tech-fueled) one of the 90s--which means that federal revenues, and hence spending, are going to take a hit.

Third, I get the sense that the US will find itself increasingly hard-pressed to compete against the one-two-three-four Asian punch of Japan, China, India, and Indonesia. Looked at as “metro areas” in an Asian economic “megalopolis,” these countries increasingly have their ducks in a row: vigorous R&D, IT, services, and manufacturing sectors with ample resources, stable governments, rapidly improving infrastructure, educated disciplined workforces, and a half-the-world-population “internal” market(s).

Something structural is going on that could permanently impact the competitiveness of this “metro area” we call the United States of America, and of the high-tech solution providers that operate primarily in the lower 48. Any cyclical 2008-20** blip of a US recession could mask that greater concern. What I saw growing up in declining Detroit tells me that some metro areas don’t recover from structural weaknesses for a long, long time. If ever.

As I said, some recessions are transformative--but not to the benefit of everyone upon whom the structural transformation works its magic.

Personal bottom line: I’m no more wedded to the US than I was to Detroit. It’s an accident of fate, really, where you get planted. Fortunately, homo sapiens has handy skills and opposable rules of thumb.

Plug for my hometown: Detroit has casinos now. Hie thee thither and gamble, if you’re so inclined.

Jim

Tuesday, March 11, 2008

The R-Word Chronicles, Vol. 2

All:

For those of a certain generation, recession was formative. So its coming now isn’t so much a cause for alarm as a reminder to stay calm. We have the inner resources to persist through tough times.

I’m originally from Detroit, a city that sometime in the 1970s entered what has turned out to be more or less a permanent recession. I like to think of that decade as the “Great Recession” (albeit with legal booze). For me, the 1970s was the decade of high school and college, with all the usual anxieties that attend those phases of life. It was also a decade book-ended by the deaths of my mother and father. For our country, it was the decade of stagflation, an abandoned war of attrition, a political scandal that toppled a president, an energy crunch precipitated by geopolitical nastiness, a nuclear meltdown, a mass-suicide doomsday cult, and an embassy hostage crisis that toppled yet another president.

Toward the end of the 1970s, between my junior and senior years in college, I had an internship at an urban coalition in Detroit. The group, New Detroit Inc., was formed after the riots of the late 60s to rebuild the city’s torn economic, political, and social fabric through engagement with neighborhood organizations and with business and government. That summer, I was a policy analyst--my first analyst job (I was an economics major, by the way). In addition to representing the coalition at community meetings throughout the city, I had a research project. My job was to estimate the likely financial impact on Detroit, in foregone federal entitlement dollars, from an undercount in the upcoming 1980 census. If I recall correctly, I took 6-7 weeks to produce a defensible estimate: $50 million lost to Detroit if the city’s residents were undercounted at the expected rate of (I forgot what that rate was).

But for me, the real take-away from that summer was a firmer sense that Detroit was in for a very long structural decline. It wasn’t so much that the city’s population would be undercounted. It was more a matter of the city being far too overpopulated for its ever shakier economic base. Southeastern Michigan’s automobile-centric manufacturing employment base had peaked in 1955, three years before I was born, and had not been replaced by any other industries that could absorb the great number of unskilled and semi-skilled citizens who seemed to be waiting, biding their time, until some new great wave of industrial salvation restored them to some semblance of unearned, under-educated middle-classedness. Then and now, people just tend to stay put in Detroit, in ever more decrepit jobs, houses, and neighborhoods. There was and is precious little new business formation. The labor unions long ago became a reactionary force, an obstacle preventing Detroit’s antiquated industrial base from morphing with the times.

When any recession approaches, I ask myself to what extent it’s a cycle that will soon reverse itself, and to what extent it masks a deeper-seated structural change in the economy. Fortunately, the US economy bounced back nicely from the stubborn recession of the late 70s and early 80s: I’ve spent most of my adult life and career in a growth economy, in a different metro area (Washington DC) from where I grew up, and in a different industry (IT) from what I imagined in college. I thought I’d end up an economist with some Rust Belt manufacturing firm, enlisted in some ongoing defensive effort to fend off “Japan Inc.” or whatever other, more dynamic, more adaptable economic challenger threatened my employer’s livelihood. I expected to live my work life on the defensive, like the auto industry troglodytes (and their offspring) all around me in my salad days. (For the record, my father was a salesperson with mainframe computing pioneer Sperry Univac, though he personally sold electromechanical filing systems, office furniture, microfiche machines, and other non-computing products).

All I know, from all my experience, is that no one and nothing is truly “recession-proof.” A big part of the 1970s was observing my father’s 20-year career with Sperry come to an end, not due directly to any recessionary layoffs or what have you, but due mostly to Sperry continuing to lose ground to IBM and others in the mainframe computer biz. For my father, who was a great salesman (if his numbers tell the story, and they should), the 1970s was a structural breakpoint for his bread-and-butter. Sperry couldn’t sustain its 1950s-1960s heyday. Come the 1980s, it merged with Burroughs, which begot Unisys, which beat a hasty retreat from computing hardware and software altogether, becoming the professional services firm it is today. Without a doubt, a similar structural shift will carry some of the 1990s’ former software juggernauts into the next era, while others will disappear as going concerns (or brands, which amounts to the same thing--lose your former identity, lose your footprint in cultural memory).

Nothing’s recession-proof, but, if we keep calm, we’ll find the inner resources--many of them purely spiritual--to persist as going concerns through cyclical and structural shake-outs. A recession is as much a collective spiritual funk as an officially certified downturn in leading economic indicators. Just as, earlier in this post, I strung together a litany of semi-coincidental nasty events that are forever stained into my recollection of the recessionary 70s, we can each summon up a peeve list of crap-happening-now, and link it to whatever recession may or may not take hold shortly.

But what’s the damn point? Just stay calm. If you’ve spent your entire career in the IT world, you’ve been building up immunity to cyclical and structural shakeouts, which happen with such fervent regularity, in “Internet time,” that they don’t spook us. Unlike the auto workers and their kids who always pinned their futures on the certainty that there would be a cushy unionized job awaiting them “on the line at Wixom,” we hold no such illusions.

We’ve spent our lives morphing, and holding on, awaiting the next shift, and the one after that too, prepared for whatever may come. Keeping spirits intact.

Jim

Friday, March 07, 2008

The R-Word Chronicles, Vol. 1

All:

My BI/analytics-related take on this new collective preoccupation, which crystallized in my mind while on a roadtrip the week before last, visiting a couple of vendors in this space:

************************************

http://www.networkworld.com/columnists/2008/030408-kobielus.html
Recession fears put focus on business intelligence

Above the Cloud By James Kobielus , Network World , 03/04/2008

Modern economies are hypochondriacs of the highest order. They check their own pulses at obsessive intervals, searching for symptoms of weakness, ever ready to fend off decline through stimulants that may or may not help the situation.

In the past few months, we've all been encouraged to scout the economic horizon for signs of the dreaded "R" word -- a malady so dire, apparently, that its very name may not be uttered in polite company. Throughout the IT world, we've been scrambling to put together contingency plans for dealing with a down economy, if and when it materializes. Everyone -- IT vendors and users alike -- has been hedging their bets and watching their pocketbooks, just in case.

Recently, the IT industry has started to latch onto a curious notion: that business intelligence and performance management applications can help users weather whatever rainy day may or may not come. In other words, analytics applications are increasingly being positioned as tools for determining what to cut, trim and scale back from operations while, hopefully, minimizing adverse impacts on the business.

There is some validity to this viewpoint, though one can't help thinking it has come into vogue -- at least in part -- through a self-serving push by vendors of analytic applications. What's undeniable is that many enterprises are better prepared than ever to deal with economic uncertainty, having invested heavily in business intelligence and analytics over the past few years.

They are well-armed with reporting, scorecarding, dashboarding, forecasting, what-if modeling, interactive visualization, and other analytical tools for sifting through operational data and identifying promising areas for business optimization under tightening economic constraints.

Indeed, the business intelligence industry's recent emphasis on financial analytics suites has given chief financial officers (CFOs) increasingly sophisticated tools for determining, with surgical precision, where to apply the budget scalpel. By the same token, many human resources directors have powerful human capital analytics for identifying which positions can safely be eliminated, and which hiring decisions may be postponed indefinitely, when the economy goes sour.

Likewise, supplier relationship management analytics tools let companies understand their options for dropping marginal vendors in favor of those that can offer preferential pricing. Still other analytics tools promise similar optimization benefits across the full range of business functions.

Clearly, analytics is a key asset in the ongoing business optimization struggle, both in good times and in bad. There are many business analytics initiatives that can help organizations consolidate, spend modestly and tweak existing processes within fiscal constraints. In fact, one of the key tests of any analytics-driven corporate business model is its ability to deliver superior results in eras of austerity.

For IT professionals, the greatest test may be in how well they safeguard their organizations' core analytics assets from budget cuts during down economic times. The core issue is: How can you optimize your end-to-end analytic environment -- that is, control the costs of your business intelligence applications, predictive analytics applications, enterprise data warehouses and other key infrastructure -- without impairing service levels or limiting your company's ability to leverage analytics into new business opportunities?

The following are the most fundamental pointers for protecting your enterprise's analytic core in a down economy:

• Outsource as much of your business intelligence, data warehousing, performance management, and other analytic applications as you can to software-as-a-service providers who can provide it to you on a pay-as-you-go basis. This eliminates the need for you to manage it all yourself from dedicated, in-house data centers with dedicated, full-time staff of your own.
• Single-source as many of the functional components of your analytic infrastructures as makes sense from vendors of comprehensive, all-in-one suites of business intelligence, data warehousing, data marts, extract transform load, data cleansing and other critical components. That way, you can obtain a bundled, integrated solution at a lower cost than if you procured these components separately from several vendors.
• Consolidate as much of your enterprise data warehouse (EDW) environment as you can--with associated reduction in server hardware, software licenses, and full-time operations and support staff -- into fewer, more scalable, more energy-efficient, more cost-effective data centers.
• Migrate as much of your less-in-demand operational data to more cost-effective "cold" storage (offline, near-line) as you can, while only keeping the most in-demand data in more expensive "hot" (online) storage in your EDW.
• Offload more of your high-volume online analytical processing (OLAP) workloads to the new generation of data warehouse appliances, which can accelerate query processing at a fraction of the cost of traditional data warehouses while also freeing up EDW processing/storage capacity for other workloads.
• Virtualize as much of your business intelligence, OLAP and other workloads as you can to grid, massively parallel processing, and other scalable, distributed processing architectures, so that you can run more of this processing on inexpensive commodity servers, share workloads across available CPU and storage resources, and move these loads off mainframes and other "big metal" platforms that are optimized for online transaction processing.
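To make the hot/cold storage tiering recommendation above concrete, here's a minimal sketch of an age-based tier-assignment policy in Python. The threshold values and tier names are illustrative assumptions on my part, not any vendor's defaults:

```python
from datetime import date, timedelta

def choose_tier(last_accessed: date, today: date,
                hot_days: int = 30, nearline_days: int = 180) -> str:
    """Assign a storage tier based on how recently the data was queried.

    Thresholds are illustrative assumptions, not vendor defaults.
    """
    age = (today - last_accessed).days
    if age <= hot_days:
        return "hot"        # keep in the EDW's expensive online storage
    if age <= nearline_days:
        return "near-line"  # cheaper disk, still queryable with some delay
    return "offline"        # archival storage; restore on demand

today = date(2008, 3, 7)
print(choose_tier(today - timedelta(days=10), today))   # hot
print(choose_tier(today - timedelta(days=90), today))   # near-line
print(choose_tier(today - timedelta(days=400), today))  # offline
```

In practice the same rule would be driven by query logs rather than a single date field, but the point stands: the migration decision is a simple, automatable policy, not a manual project.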

None of these belt-tightening recommendations should be radically new or unfamiliar to IT professionals. These guidelines should not be regarded as mere IT contingency plans for coping with budget austerity. They are best practices that analytics-driven organizations should implement, in both boom and bust periods, to strengthen their business core.

************************************

"Strengthen their .... core"--that's certainly a yoga allusion, for those who are keeping score. Let's keep the metaphor going. If you strengthen your business analytics core, you develop a six-pack of abs/apps, and you can tighten your fiscal belt comfortably etc. Hokey smokes, Bullwinkle!

Jim

Friday, January 18, 2008

My Forrester Research coverage areas--and linkages with others'--plus some new thinking on it all

All:

Hi. Yes, I plan to continue blogging to this, my personal pulpit, in addition to contributing to Forrester's blog. As you can well imagine, I've been busy as can be this month orienting to Forrester. In fact, I've been in Cambridge much of this month (including right now) getting orientated, acclimated, acculturated, caffeinated, invigorated, accelerated, and downright exhilarated to all things Forrester. Seriously great team I'm with.

For starters, you might have noticed that Boris Evelson and I co-blogged this week, under Forrester auspices, on the Oracle/BEA and Sun/MySQL acquisitions. You can read it at http://blogs.forrester.com/information_management/business_intelligence/index.html. No doubt, you've also read various things that Boris, myself, and other Forrester analysts said to the many IT trade press reporters who called us all-day Wednesday about those stories.

So I've been busy from the get-go at Forrester. In coming weeks, our customers will see my first published Forrester document: on the evolution of data warehousing (DW) appliances, and on best practices for evaluating, deploying, and managing them. Appliances in all their diversity have become the dominant industry approach for rolling out purpose- and performance-built solutions in support of online analytical processing, bulk data loading, and other core DW functions. Rest assured that I'll present a multilayered definition of this go-to-market approach that does justice to the range of vendor implementations. I'll also, in subsequent reports, delve deeper into such key enterprise requirements as real-time DW, master data management (MDM) applications in DW implementations, and convergence of structured, semi-structured, and unstructured data in DW environments. As you can well imagine, I'll collaborate closely with such esteemed analyst colleagues as Boris (our BI guru), Rob Karel (data integration, data quality, MDM), and Noel Yuhanna (DBMS).

I'm teamed most closely with Rob and Boris. From a corporate hierarchy viewpoint, we're all in Forrester's IT Client Group, under that group's Information and Knowledge Management (I&KM) orbit, and in that orbit's Data Domain, focusing primarily on structured data. Please note that we have several I&KM colleagues, including Kyle McNabb, Barry Murphy, Craig Le Clair, and Steve Powers, in the Enterprise Content Management domain, who look at unstructured and semi-structured info. And I shouldn't overlook other important analysts in I&KM, including those who cover collaboration (Erica Driver, Rob Koplowitz, Connie Moore, Colin Teubner, and Claire Schooley) and information access (Matt Brown, Ken Poore, and Leslie Owens).

Re Rob and Boris, it makes perfect sense for us to be in the same domain, because DI, DQ, DW, and BI are inextricably linked disciplines that are separated only by semi-permeable membranes. Which reminds me: I've long used the following definitions to distinguish the conceptual "demarc" between them:
  • Business Intelligence (BI): BI includes all tools and runtime components necessary to provide actionable information, insight, analysis, and decision support to business users. Information may be retrieved into BI environments from one or more repositories, including diverse databases, data marts, data warehouses, operational data stores, document management systems, and online transaction processing systems.
  • Data Warehousing (DW): DW includes all tools and runtime components necessary to consolidate structured master data into subject-oriented, integrated, non-volatile, time-variant repositories under unified business governance. DW environments consolidate master data from source data stores through various DI approaches and govern its controlled distribution to various operational data stores, data marts, access databases, and BI environments.
  • Data Integration (DI): DI includes all the tools and runtime components needed to retrieve, extract, and move data from origin repositories; validate and transform the data; and deliver it to target databases, data warehouses, data marts, and applications. The DI marketplace also includes data quality (DQ) solutions for profiling source data; matching, merging, and cleansing that data; and augmenting it with additional, related data.
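To make the DI definition concrete, here's a toy extract-validate-load pipeline in Python. The record layout and cleansing rules are hypothetical, purely to illustrate the retrieve/transform/deliver stages described above:

```python
def extract(source_rows):
    """Retrieve raw records from an origin repository (a list stands in here)."""
    return list(source_rows)

def transform(rows):
    """Validate and cleanse: drop records with no customer ID, normalize names."""
    cleaned = []
    for row in rows:
        if not row.get("customer_id"):
            continue  # DQ step: reject records that can't be matched downstream
        row["name"] = row["name"].strip().title()
        cleaned.append(row)
    return cleaned

def load(rows, target):
    """Deliver cleansed records to the target store (a dict keyed by ID here)."""
    for row in rows:
        target[row["customer_id"]] = row
    return target

source = [{"customer_id": 1, "name": "  alice JONES "},
          {"customer_id": None, "name": "orphan"}]
warehouse = load(transform(extract(source)), {})
print(warehouse)  # {1: {'customer_id': 1, 'name': 'Alice Jones'}}
```

Real DI tooling adds scheduling, metadata, matching/merging, and augmentation on top, but the extract-transform-deliver skeleton is the same.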
Which also reminds me: the membrane around my coverage area also includes a few technologies that I've traditionally included under BI (but Boris assures me that he'd prefer I continue to deepen my established research focus on them). Here they are, plus my still-wordy scoping definitions (yes, I'm working on elevator pitches...these technically qualify only if the elevator in question is in the Burj Dubai):
  • Predictive Analytics/Data Mining: Predictive analytics uses statistics-powered data mining and interactive visualization to enable forecasting and assessment of the likelihood of future events and trends, in order to support forward-looking decision making with regard to strategic planning, corporate development, sales and marketing, product development, pricing and packaging, customer service, and other critical matters.
  • Complex Event Processing: CEP uses low-latency middleware and interactive visualization to enable continuous monitoring, aggregation, correlation, filtering, and presentation of diverse external and/or internal events surfaced from operational applications, business process management systems, databases, and other sources, in order to support flexible, real-time business response and proactive coordination.
To a great degree, predictive analytics/data mining depends on DWs--i.e., hub-and-spoke DW environments in which there are data marts to support access, storage, scoring, loading, cleansing, and other life-cycle functions on structured analytical data sets. However, CEP would seem, in many real-world deployments, to do without DWs altogether--and require a more distributed, federated, real-time, low-latency, end-to-end event-stream processing middleware fabric.
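The CEP-without-a-DW point can be sketched in a few lines: events are correlated on the fly over a sliding window, with no persistent repository in the loop. This is a toy illustration (the event schema and thresholds are my own assumptions), not any vendor's engine:

```python
from collections import deque

def monitor(events, window_size=5, threshold=3):
    """Flag positions where 'error' events within the sliding window hit the threshold.

    Events are processed one at a time, in arrival order, and nothing is
    persisted to a warehouse -- the bounded window is the only state.
    """
    window = deque(maxlen=window_size)
    alerts = []
    for i, event in enumerate(events):
        window.append(event)
        errors = sum(1 for e in window if e["type"] == "error")
        if errors >= threshold:
            alerts.append(i)  # a real CEP engine would fire a notification here
    return alerts

stream = [{"type": t} for t in
          ["ok", "error", "ok", "error", "error", "ok", "ok"]]
print(monitor(stream))  # [4, 5]
```

Contrast this with the DW pattern: no batch load, no persistent repository, just low-latency state over the stream itself.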

Essentially, a traditional DW operates in store-and-forward mode, introducing latency into the delivery of data to BI environments. Most of today’s DWs have been optimized for specific latency-producing operations: extraction, transformation, and loading (ETL) of data from operational DBMSs; retention of that data in persistent repositories; and downstream retrieval of that stored data into reports, graphical dashboards, multidimensional online analytical processing (OLAP) cubes, and other BI outputs.

DWs can be re-architected to support real-time BI. In fact, most DW vendors have already begun to address these requirements in their products. At heart, doing so requires that DWs be reconfigured to also serve as real-time application-layer data “routers” (in a broad unconventional sense of that term). For example, Teradata’s “active DW” approach adds support for near-real-time ETL and data delivery. Just as important, the vendor has added the policy-driven event detection, processing, and notification features needed to manage the flow of real-time events between data sources and consumers, as brokered through the DW. It has also built into its DW environment the availability, scalability, performance optimization, and dynamic workload management features needed to monitor, sustain, and guarantee minimal latency on data throughput out to BI applications.

Though organizations are beginning to use active DWs for real-time BI, no one is seriously considering deploying them as general-purpose, application-layer routers. Mainly this is because DWs are usually deployed in hub-and-spoke configurations and thus potentially can become significant bottlenecks. Some in the industry have proposed DW federation to alleviate the potential bottleneck, but most federation scenarios are still fundamentally hub-and-spoke in their reliance on common ETL tools, metadata repositories, and data staging areas.

All of which sounds like fodder for future Forrester papers. I'll keep you posted as I firm up my research calendar over the next several weeks.

Thanks, by the way, for your kind e-mails etc. Nice to know people value my expertise. I won't disappoint.

Jim

Tuesday, January 01, 2008

James Kobielus joins Forrester Research

All:

I have joined Forrester Research as senior analyst for data warehousing. You can reach me there at jkobielus@forrester.com or 703-340-8134.

Jim

Friday, December 21, 2007

Expert Predictions for 2008

All:

Wow...quite a year....while Chris Butler and the Waitresses perform "Christmas Wrapping" on www.kexp.org and Jason behind me taps out the first draft of the play he'll also direct a couple of months from now, let me Janus-wise opine on the year gone by and the year to come....'twas fine of Jeff Kelly to spur this forest-or-trees roundup of my bread-and-butter (and has it really been three years since the Boxing Day Tsunami?):

**************************

BUSINESS INTELLIGENCE (BI)

  • BI becoming SOA’s crown jewel: The past year has seen a rash of headline-grabbing M&A deals in the BI arena, with Oracle’s acquisition of Hyperion, SAP’s deal for Business Objects, and IBM’s pending takeover of Cognos—not to mention acquisitions of smaller BI and corporate performance management (CPM) application vendors by most of those firms. It’s far too easy to misinterpret these recent events as just more of the same M&A-stoked empire-building that we’ve come to expect from large IT solution vendors. What’s driving this recent industry consolidation—which is sure to continue in 2008--is growing vendor recognition that BI is the crown jewel in any comprehensive service-oriented architecture (SOA) solution portfolio. Though Oracle and SAP (and, to a lesser degree, IBM) already had decent BI wares in their respective SOA portfolios, none of them were on any enterprise’s short list of name-brand BI solution providers—until, that is, each of them decided to grab a leading BI pure-play. SOA suites cannot be considered feature-complete unless they incorporate a comprehensive range of BI features.
  • BI evolving into tailored business analytics: CPM—sometimes called “business analytics”—is rapidly becoming a key competitive front in the BI wars. Increasingly, BI/CPM vendors are offering tailored solutions for a dizzying range of horizontal business requirements and vertical industries. Vendors’ continued profitability also hinges on their ability to provide the professional services necessary to create, customize, and support business analytics for each vertical industry’s and specific customer’s unique requirements. Without a doubt, we’ll see further verticalization of product and service offerings by CPM vendors in 2008, which will provide a necessary hedge against the inevitable creep of commoditization into such horizontal analytics segments as financial, human resources, sales and marketing, and supply chain management.
  • BI going truly real-time through complex event processing: Complex event processing (CEP) promises business agility through continuous correlation and visualization of multiple event-streams. However, CEP has heretofore been conspicuously missing from the mainstream BI arena, necessitating stovepipe CEP implementations that are only loosely integrated with enterprises’ existing visualization, reporting, dashboarding, information modeling, metadata, and other BI infrastructure components. That will change big-time in 2008, as most leading BI vendors start to partner with CEP pure-plays, or acquire them outright, in order to strengthen their support for real-time event-driven applications. We expect to see SAP/Business Objects, IBM/Cognos, Oracle/Hyperion, SAS Institute, Microsoft, Information Builders, and MicroStrategy venture into the CEP arena in the coming year. Likewise, it’s very likely that the newly independent Teradata, which has taken the lead in real-time data warehousing (DW), will snatch up a CEP vendor to build out its real-time BI portfolio.
  • BI bundling with DW appliances: Appliances have even begun to take up permanent residence at the heart of the enterprise data center: in the DW and BI infrastructures. Increasingly, vendors are focusing on integrating, packaging, and pricing their DW/BI products as pre-configured, modular appliances for quick deployment. These appliances consist of processing, storage, and software components that have been prepackaged, preconfigured, and pre-optimized for core DW/BI functions such as multidimensional online analytical processing (OLAP) queries, bulk data loading, and online archiving. The past year saw a growing range of DW vendors—including such DBMS powerhouses as IBM, Oracle, and Microsoft—reorient their DW/BI go-to-market strategies around the appliance model. In turn, leading BI vendors such as Business Objects and Cognos made a big push into the appliance arena. In 2008 and beyond, more and more DW vendors will pre-integrate BI solutions—their own and/or those of their partners—into their appliances. Increasingly, DW/BI appliances will be tailored, packaged, and priced for many market segments and deployment scenarios.
  • BI goes collaborative: Collective intelligence is an organization’s most precious asset. Traditionally, the BI industry has offered little to directly address one of the most critical components of group IQ: the collaboration environment. Instead, most BI applications focus on delivering targeted reports, analytics, dashboards, multidimensional visualization, and other key data to individual end users in isolation, rather than to larger business teams. In the past year, though, the BI industry has begun to roll out more collaboration features in its products—such as Microsoft with its new Office PerformancePoint Server 2007 solution--or, at the very least, to begin talking about new collaboration features to expect in the coming year. In 2008 and beyond, we expect to see the BI, collaboration, and knowledge management segments converge. Likewise, we expect to see such interactive Web 2.0 technologies as AJAX, blogs, wikis, and social networking revolutionize the BI experience. Many BI vendors now realize that decision support environments should allow users to access intelligence wherever it may reside, be it in data warehouses or in the heads of remote colleagues.
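Just to make the CEP idea above concrete: the "continuous correlation of multiple event-streams" boils down, in miniature, to a sliding-window join, i.e., pair up events from two streams that share a key and land within some time window of each other. Here's a toy sketch in Python (my own illustration, not any vendor's engine; the event shapes and window logic are assumptions):

```python
from collections import deque

def correlate(stream_a, stream_b, window):
    """Toy CEP-style join: pair events from two streams that share a key
    and occur within `window` time units of each other.
    Each event is a (timestamp, key) tuple."""
    recent_a, recent_b = deque(), deque()
    matches = []
    # Merge the two streams by timestamp, tagging each event with its source.
    merged = sorted([(t, k, 'a') for t, k in stream_a] +
                    [(t, k, 'b') for t, k in stream_b])
    for t, key, src in merged:
        mine, theirs = (recent_a, recent_b) if src == 'a' else (recent_b, recent_a)
        # Evict events that have fallen out of the sliding window.
        for buf in (recent_a, recent_b):
            while buf and buf[0][0] < t - window:
                buf.popleft()
        # Correlate against the opposite stream's surviving events.
        matches.extend((key, t0, t) for t0, k0 in theirs if k0 == key)
        mine.append((t, key))
    return matches

# Hypothetical feeds: a price-tick stream and an order stream on the same tickers
prices = [(1, 'IBM'), (4, 'SAP')]
orders = [(3, 'IBM'), (20, 'SAP')]
print(correlate(prices, orders, window=5))  # the IBM pair correlates; SAP is too far apart
```

A real CEP engine adds declarative pattern languages, out-of-order handling, and throughput in the millions of events per second, but the core idea is this windowed correlation.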

MASTER DATA MANAGEMENT (MDM)

  • MDM vendors consolidate: Recent M&A activity—such as SAP/Business Objects, Oracle/Hyperion, IBM/Cognos, and Microsoft/Stratature--can be viewed as driven by vendors’ need to assemble more comprehensive solution portfolios to manage diverse master data sets and feed them into enterprise BI/CPM environments. In 2008, vendors will--through strategic acquisitions, partnerships, and internal development--assemble MDM solution portfolios that encompass best-of-breed solution elements in data integration (DI), data quality (DQ), DW, DBMSs, cross-catalog hierarchy management, pre-built domain models, data modeling and mapping, and data governance (DG). We expect to see such leading MDM pure-plays as Siperian, Initiate Systems, and Kalido be acquired by larger vendors looking to build up their MDM portfolios.
  • MDM vendors converge their platforms: Some of the recent industry consolidations have brought together former rivals who each have their own MDM solution portfolios that will need to be converged in 2008 and beyond onto a common platform. IBM, which acquired two MDM vendors in the past few years, has already pre-announced a converged new MDM solution that will be generally available in the first quarter of 2008. Oracle, which acquired Hyperion’s financial data-hub MDM solution in 2007, is likely to converge that offering with its pre-existing customer data integration (CDI) and product information management (PIM) MDM offerings on the Fusion Middleware platform in 2008. Likewise, we expect to see SAP begin to bring some important MDM-enabling Business Objects technology—especially strong data profiling and cleansing—into its established NetWeaver MDM offering.
  • MDM vendors differentiate through prepackaged solution accelerators: MDM projects are often complex, costly, and time-consuming. Recognizing this barrier to user adoption, vendors have increasingly sought to lower total cost of ownership through prepackaged MDM solution accelerators, sometimes known as domain models or templates—for CDI, PIM, and various vertical application domains. The leading MDM vendors—such as IBM and Oracle—rolled out and enhanced solution accelerators in 2007, and this trend will extend to all vendors, large and small, in 2008 and beyond. These solution accelerators consist of packaged master data definitions, schemas, models and objects, plus data governance infrastructure necessary to tailor an MDM or DW environment for a particular horizontal application (such as CDI, PIM, and financial consolidation) or vertical, industry-specific deployment (such as retailing, financial services, consumer packaged goods or healthcare).
  • MDM becoming more of a services than a product market: Even with prepackaged solution accelerators, there is no such thing as a truly shrink-wrapped, plug-and-play MDM solution. In practical terms, MDM often refers to a target enterprise architecture that can be complex, costly and difficult to implement and administer. Consequently, enterprises must engage professional services organizations to guide their MDM projects and operations every step of the way. In 2007, MDM solution vendors continued to build out their own consulting and systems integration (SI) organizations and to cultivate their partnerships with leading professional services firms that have expertise in CDI, PIM, and other key MDM applications. We expect to see this trend accelerate in 2008 and beyond, as professional services leads the way in most vendors’ MDM go-to-market strategies, and as these domain experts become the primary content providers to vendors’ precious MDM solution accelerators.
  • MDM deployments become more decentralized and virtualized: Traditionally, MDM has been deployed in a centralized fashion around the enterprise DW. However, 2007 saw more and more vendors stress more decentralized, virtualized deployment models for MDM, such as IBM with its focus on “multi-style, multi-form, multidomain” deployments. We expect this trend to accelerate in 2008, as vendors respond to users’ demands for life-cycle management of master data sets across federated environments. Increasingly, users are requiring flexible MDM environments that allow them to deploy master data sets in centralized or federated topologies, while retaining unified, SOA-based DG across their enterprise service bus.
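To make the "solution accelerator" notion from the list above concrete: in code terms, an accelerator is essentially a prebuilt domain model plus packaged governance rules. A hypothetical, much-simplified CDI-style template might look like this (the field names and validation rules are purely illustrative, not any vendor's actual schema):

```python
from dataclasses import dataclass, field

@dataclass
class CustomerMaster:
    """Toy CDI domain template: packaged master data definitions
    plus a few data-governance validation rules."""
    customer_id: str
    legal_name: str
    country: str
    email: str = ""
    source_systems: list = field(default_factory=list)

    def validate(self):
        """Return a list of governance-rule violations (empty means clean)."""
        issues = []
        if not self.customer_id:
            issues.append("customer_id is required (survivorship key)")
        if self.email and "@" not in self.email:
            issues.append("email fails format check")
        if len(self.country) != 2:
            issues.append("country must be an ISO 3166 two-letter code")
        return issues

rec = CustomerMaster("C001", "Acme GmbH", "DEU")
print(rec.validate())  # ['country must be an ISO 3166 two-letter code']
```

The commercial accelerators bundle hundreds of such entity definitions, match/merge rules, and industry-specific hierarchies; the value is in the prebuilt content, not the mechanism.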
**************************

Next up, new vistas in Data Warehousing. More on that in the next few.

Jim

Tuesday, November 20, 2007

ink-impressed tek-biz bound-bulk-pulp review “The Limits of Privacy” by Amitai Etzioni

All:

With this present post, I’ve now reduced the still-gotta-finish-this-book stack to an illustrated history of Peanuts (key take-away: this classic 50-year strip hit a new plateau when Snoopy, in the late 50s, started to stand on his hind legs, dance, verbalize, and serve as wild-card for any flight of Schulz’ imagination); a detailed history of Fairfax County, Virginia (key take-away: the site where my wife and I work out was, before the Civil War, a large day-laborer market for freed slaves and for slaveholders seeking to rent out their chattel); and Joseph Campbell’s “Myths to Live By” (key take-away: he was a devotee of Carl Jung, and he didn’t much care for hippies). No…I won’t blog further on those titles (or will I?).

Now to the reviewed book of the day. “The Limits of Privacy” came free from the author, a political science professor at George Washington University. I met Amitai Etzioni last summer at the birthday party for privacy advocate Marc Rotenberg (director of the Electronic Privacy Information Center) in Washington DC. It was a good discussion, over potluck, followed by a mutual exchange of business cards. A week or so later, Etzioni’s book arrived in the postal mail with a note: “Please accept this publication with compliments of: The Institute for Communitarian Policy Studies. If you wish to consult with Amitai Etzioni, he may be reached at (202) 994-8190.” I pass this along to you my readers in case you need such services. Sending me the book itself, transmitted from his gratisphere to mine, was the best service that Etzioni could have rendered.

“The Limits of Privacy” is a brilliant dissection of current legal, regulatory, policy, economic, and cultural issues surrounding the issue of privacy protection. It was published in 1999 but still feels fresh, thanks in large part to the solid analysis at its core, laying out a clear and practical set of principles for balancing privacy concerns against “common good” imperatives such as public safety and health. The book has been sitting on my dresser for several months now, inviting me to pick it up, glance at this or that chapter again, and put it down for future browsing. That’s not a limitation of the tome—it’s a strength—always something there to tickle the cerebral cortex with new insights. I’m surprised it’s taken me this long to blog on it.

Though written for public policy wonks, it is very much a tek-biz book. First off, many of the current privacy controversies that Etzioni discusses revolve around applications of information technology: e.g., strong encryption, biometric identifiers, national ID cards, medical records disclosure procedures, web-based publication of convicted sex offender identities/addresses, etc. Second, he identifies large corporations—or as he calls them, “Big Bucks”—as the primary violators of privacy, and is less concerned about government agencies, which he refers to by the Orwellian “Big Brother.” Consequently, he’s far more concerned about overzealous big-biz interests trampling on people’s privacy—remember, this book was published two years before 9/11, during the first dotcom bubble…so keep that historical context in mind…though his concerns are still valid.

However, Etzioni is no privacy-absolutist libertarian or “cypherpunk” (yes, Kim Cameron, this is a real word/movement…or, at least, was back then in the late great 90s….see p. 97 of Etzioni’s book). In fact, Etzioni takes pains to distance himself from the privacy absolutist camp. He’s a different species entirely: a “communitarian,” who believes that the privacy zealots, though well-meaning, have gotten out of hand and need to be counterbalanced by a co-equal emphasis on compelling “common good” concerns that may, in specific circumstances, justify limits on privacy (hence the title of the book). Per pp 195-196:

  • Etzioni: “Contemporary champions of privacy often still employ arguments that treat privacy as either an unbounded or privileged good….The negative consequences, however, of treating privacy and other individual rights as sacrosanct have been largely ignored by those who draw on legal conceptions fashioned in earlier ages….[In] American society after 1960….[t]he realms of rights, private choice, self-interest, and entitlement were expanded and extended, but corollary social responsibilities and commitments to the common good were neglected, with negative consequences such as the deterioration of public safety and public health. The new sociohistorical context, as we see it, calls for greater dedication to the common good and less expansive privileging to the individual rights.”

Of course, everybody has a different reading on the “sociohistorical context,” and many might say it’s not a “new” context at all, but just the same old tug-of-war between true blue defenders of civil liberties and those who would attempt to limit those precious freedoms under the usual authoritarian pretexts such as the need for “law and order,” a “return to family values,” the “war on terror,” and so forth. Hence, the never-ending stalemate between “libertarians” and “communitarians” (or whatever labels you wish to assign to the polar camps in this culture war).

Lest you think that Etzioni is a far-right-wing absolutist, I urge you to read the book and see how well he balances and nuances his policy analysis and recommendations in his treatment of various privacy controversies. “The Limits of Privacy” offers a practical, sensible, context-sensitive mechanism for identifying necessary privacy limits, consisting of the following criteria (pp. 12-13):

  • FIRST CRITERION (applies only where a threat to society crosses a threshold identified in the criterion): “[T]ake steps to limit privacy only if [society] faces a well-documented and macroscopic threat to the common good, not merely a hypothetical danger.”
  • SECOND CRITERION (applies only if the threat specified by the first criterion is identified): “[C]ounter [that threat to society] without first resorting to measures that might restrict privacy.”
  • THIRD CRITERION (applies only if the non-privacy-diminishing measures specified by the second criterion cannot be identified): “Make [privacy-diminishing measures] as minimally intrusive as possible.”
  • FOURTH CRITERION (applies only if the privacy-diminishing measures specified by the third criterion are being evaluated for possible implementation): “[M]easures that treat undesirable side effects of needed privacy-diminishing measures are to be preferred over those that ignore these effects.”
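Read as a procedure, the four criteria form a sequential gate: each one applies only if its predecessor has been satisfied. Here's that gating logic sketched in Python (my paraphrase of Etzioni's criteria, purely illustrative; the function and argument names are my own):

```python
def evaluate_privacy_limit(threat_is_documented_and_macroscopic,
                           non_privacy_measures_suffice,
                           measure_is_minimally_intrusive,
                           treats_side_effects):
    """Apply Etzioni's four criteria in order; each one gates the next."""
    if not threat_is_documented_and_macroscopic:              # criterion 1
        return "no action: threat is hypothetical, not well-documented"
    if non_privacy_measures_suffice:                          # criterion 2
        return "counter the threat without diminishing privacy"
    if not measure_is_minimally_intrusive:                    # criterion 3
        return "reject: a less intrusive measure must be found"
    if not treats_side_effects:                               # criterion 4
        return "prefer a variant that treats undesirable side effects"
    return "privacy-diminishing measure is justified"

print(evaluate_privacy_limit(True, False, True, True))
```

What's notable is how strong the first two gates are: unless a threat is documented and macroscopic, and unless no privacy-neutral remedy exists, the procedure never even reaches the question of intrusiveness.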

More than just laying out these criteria, Etzioni demonstrates in privacy-relevant case after case how they can be used to clarify the pros and cons of various policy alternatives. Essentially, he’s doing an economics-like trade-off analysis of privacy vs. other important “goods,” implicitly recognizing (some of the forthcoming air quotes inserted by yours truly) that each privacy-protection measure may have a countervailing “opportunity cost” in forgone “common goods,” and may introduce “externalities” that nullify its advantages in whole or part. This approach reflects the fact that privacy protection is part of a never-ending balancing exercise that must consider larger cultural, economic, and geopolitical issues.

Nevertheless, I’m still a bit troubled by the slipperiness of Etzioni’s overarching “sociohistorical context” framework for identifying the proper balancing point for privacy rights. From what I can see, it could be used to justify a radical rewrite of privacy laws/regs toward either a fascistic or anarchistic extreme, on the grounds that such a sudden, disruptive measure is necessary to rebalance the sociohistorical equation. Per the Conclusion on p. 215:

  • Etzioni: “Above all, a communitarian approach to privacy avoids the failings of static conceptions by taking into account sociohistorical changes. For example, it recognizes that [if] more privacy is granted from informal social controls in a given period, the more state controls will be necessary in following years to sustain the same level of social order.”

Etzioni follows this statement with an assurance that this approach is necessary for society to avoid either of the ideological extremes, where privacy is concerned. However, his closing statement (to the entire book) seems prone to misinterpretation, stressing as it does the need for “endeavors to ensure that society’s elementary needs for public health and public safety are not neglected.” Many dictators reinforce the legitimacy of their regimes by citing the need to defend the public’s safety and health (physical, economic, moral) from enemies, foreign and domestic—hence, the “law and order” and “spiritual cleansing” justifications for tyranny.

So, careful there, Amitai. Words are swords. Rhetorical edges can double back on those who wield them. Irony, red as passionate prose, sharp as stainless steel.

Jim

Monday, November 19, 2007

ink-impressed tek-biz bound-bulk-pulp review “Revolt in the Boardroom” by Alan Murray

All:

Winnowing down the stack of recent tomes rendered to me by others via the gratispherical outreach arms of their respective career-o-spheres. Free is neat.

Murray’s “Revolt in the Boardroom” is more of a biz-biz than a tek-biz book (but that’s fine…I’m feeling reasonably bizzy at this particular moment). It is a well-researched and compelling discussion of the changing governance practices of corporations in the post-millennium, post-9/11, post-Enron, post-SarbOx era. Written by an assistant managing editor at the Wall Street Journal, the book focuses on the steady weakening of the CEO’s clout and strengthening of corporate boards of directors, who are now far less willing to kowtow (or so Murray argues) to arrogant, authoritarian, corrupt chief executives. The book primarily focuses on US-based corporations, though it hints at being generalizable to a global scale (implicitly, there’s this assumption that America sets the lead for the world at large…which is highly debatable).

This book was one of two that were given to me at SAS Institute’s recent Premier Business Leadership Series Conference (the other book was Davenport/Harris' “Competing on Analytics,” which I reviewed in this blog late last week), an event that was, of course, heavily focused on software for corporate performance management (CPM). To the extent that there’s any tek content in “Revolt in the Boardroom,” other than a detailed discussion of the recent C-suite travails of tek-vendor HP, it’s on page 27, wherein Murray refutes the late John Kenneth Galbraith’s contention (in 1971) that advanced technology is making the economy more supply-push by strengthening corporations’ ability to engineer demand for the output of their factories. Here’s Murray’s rebuttal to JKG, citing the role of information technology in making corporations more demand-pull:

  • Murray: “[I]n fact, a revolution in information technology helped to bring companies much closer to the marketplace, providing them faster access to information on what consumers were demanding, and giving them greater ability to adjust to those demands.”

In the broadest perspective, the book looks at the need for strong corporate governance, risk, and compliance (GRC) management—though it looks at it from a purely business-trends perspective. Murray's discussion pays no attention to how CPM software or other IT solutions can enable more effective GRC measurement and enforcement. That’s not a weakness of the book….just a matter of scoping…indeed, the very final paragraph practically screams for a GRC/CPM/analytics-focused sequel:

  • Murray: “Academics have tried to settle this debate, looking for evidence that ‘good corporate governance’—i.e., an effective check on a CEO’s power—leads to better performance for shareholders. So far, however, the evidence is mixed. In part, the problem is one of definitions. What is ‘good governance’? How do you measure ‘performance’? At the end of the day, the studies are inconclusive. The choice between the old regime and the emerging new one seems to be more a matter of faith and preference than reason or science.”

IMHO, Murray is throwing in the towel prematurely on this critical issue. He’s implying that the justification for good governance is purely intuitive and qualitative. In fact, it’s way too important to leave purely to the warm and squishies. I’d like to see a follow-on entitled “Complying on Analytics.” Davenport and Harris—opportunity for you!

Also, reading through this book, it’s not clear to me what emerging new “regime” Murray’s referring to. He doesn’t conclusively demonstrate any enduring restructuring of the institutional basis for governance of public corporations in the USA or anywhere else. All he points to are a “new CEO” (translation: a fresh batch of folks in those positions who are slightly less arrogant, more collegial, and more broadly stakeholder-focused than the bunch they’re succeeding) and a “new power elite” (translation: greater, albeit still minuscule, representation of pension funds, shareholder advisory services, social activists, hedge funds, and nongovernmental organizations on corporate boards of directors).

But this “new order” is just a matter of the latest transient swing in the corporate culture, responding to recent events in the economic, regulatory, and political arenas. This so-called “democratization” of corporate governance (activist boards!) can easily swing back to a preference for autocratic leaders (visionary CEOs!) once we get some fresh, charismatic new movers and shakers in the C-suites of this world.

We compete on a global scale. Chinese regimes, for example, are not known for C-suite transparency.

Jim

Friday, November 16, 2007

ink-impressed tek-biz bound-bulk-pulp review “Competing on Analytics” by Thomas H. Davenport and Jeanne G. Harris

All:

Winning is a “science” now, or so says the subtitle of this new book. Funny, I thought winning was an art—or, rather, a result to be sought through art, science, dumb luck, karma, magic, good genes, treachery, God’s grace, or what have you.

Regardless, winning is adaptive success, and adaptation through natural/competitive (and/or engineered) selection is what drives evolution, and there is some science (i.e., a systematic, fact-based, collaborative inquiry into basic principles, descriptive and predictive) behind our belief that evolution is how life in all its crazy heterogeneity continues to cultivate God’s green Earth, so I’ll grant them this word/concept in this context.

Actually, let me take this opportunity to spell out my core definition of “science,” and then map it into Davenport/Harris’ discussion of how analytics supports a science-like approach under which humans manage to tighten and hopefully brighten our stewardship over this planetary inheritance.

I actually addressed this matter indirectly on July 12 of this year, in this blog, under the seemingly endless (though only two month) “Ocean Semantic” thread. Buried in an extremely long shapeless run-on paragraph near the end of that thread, and couched in the context of a gratuitously erudite observation on Kant’s metaphysics, here’s how I defined “science”: a “process of progressive societal construction of an interlinking system of empirically verifiable statements through the building and testing of interpretive frameworks via controlled observation.”

The key concept here is “controlled observation,” and, in particular, the notion of appropriate controls on (empirical) observations. Pretty much everybody agrees that the key controls on scientific investigations--in order to “build and test interpretative frameworks,” i.e., construct and confirm hypotheses—should be some combination of analytical, logical, mathematical, statistical, experimental, demonstration/replication, independent verification, peer review, and other methods, procedures, checkpoints, and so forth. Some controls are more appropriate and feasible for some branches of scientific investigation than in others (e.g., you can do controlled, laboratory, experimental verification in organic chemistry more readily than in astrophysics). Such fact-based controls are designed to drive the decision to confirm or not confirm hypotheses, or disprove, qualify, or constrain established theorems.

Getting now to “Competing on Analytics: The New Science of Winning,” Davenport/Harris define their core concept, “analytics,” as referring to “extensive use of data, statistical and quantitative analysis, explanatory and predictive models, and fact-based management to drive decisions.” Clearly, what they’re describing is essentially an application of scientific practices to practical matters: solving business problems. That’s cool…science done in business suits is just as valid as in lab coats….and maybe more useful where it truly counts: creating sustainable value, generating wealth, and contributing to human happiness in some small way.
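To make that definition concrete: strip "analytics" down to its bare bones and you get a model fit on historical data, with the fitted model (not gut feel) making the call. A toy sketch, with invented numbers, using ordinary least squares:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a*x + b, about the simplest
    'explanatory and predictive model' in the analytics toolbox."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Hypothetical history: promotion spend (k$) vs. incremental sales (k$)
spend = [10, 20, 30, 40]
sales = [25, 45, 65, 85]
a, b = fit_line(spend, sales)
# Fact-based decision: predict the payoff of a $50k promotion
print(round(a * 50 + b, 1))  # 105.0
```

Real-world analytics layers on messier data, regularization, and validation, but the decision-driving loop is the same: fit, predict, act, measure, refit.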

The book is an excellent discussion of how enterprises can compete through smart application of statistical analysis, predictive modeling, data/text mining, simulation, business intelligence (BI), corporate performance management (CPM), online analytical processing (OLAP), data warehousing, data cleansing, expert systems, rules engines, interactive visualization, spreadsheets, and other applications and tools that once, in the prehistoric days before I entered the industry in the mid-80s, were often lumped under the heading of “decision support systems” (DSS). It’s no surprise that I received the book as a freebie for attending a recent conference sponsored by SAS Institute, which was not only a pioneering vendor in DSS starting in the mid-70s, but of course remains a powerhouse in BI, CPM, data mining, statistical analysis, predictive modeling, visualization, and many of the other DSS-ish technologies I just enumerated (thanks SAS!). The book is chock full of excellent case studies of companies in many industries that have differentiated themselves, notched impressive ROI, and competed effectively through DSS-ish analytics technologies—and also by cultivating analytics-driven cultures that are spearheaded by CEOs who got analytics religion.

Analytics, analysis, and analysts truly rule…that’s for sure…I’m an analyst, so of course this resonates…and this book is a very handy set of guidelines for organizations that want to leverage their BI and other analytics investments into sustainable competitive advantage. For purely personal reasons, one of the things I noticed while reading this book is that Davenport/Harris twice give kudos to Peter G.W. Keen, who in the mid-70s, as an academic, helped pioneer/popularize the concept of DSS. The reason I say “personal” is because Peter G.W. Keen, in the mid-80s, as president of the short-lived MCI-funded DC-based quasi-analyst-firm International Center for Information Technologies, hired James Kobielus as a research associate…an experience that led to, among other things, my still-going stint as a contributing editor/pundit for Network World (though it actually wasn’t my first “analyst” job….that was actually an internship in the summer of 1979, between my junior and senior years in college, at an urban coalition, New Detroit Inc., as a policy analyst, trying to help that city, near which I grew up, recover and rebuild from its sad decline…but I digress). Closing the loop on Keen, when I first picked up Davenport/Harris’ book (but before opening the cover), I thought to myself: “hmmm…’Competing on Analytics’….somehow, it reminds me of the title of Keen’s ‘Competing in Time’ book, which was published during my ICIT stint….hmmm….”

Anyway, one of many things I like about Davenport/Harris’ book is their nuanced discussion of the proper roles of analytics vs. intuition in business decisions, and of the roles of automated analytic tools vs. human analysts (on the latter….whew, I thought….at least they recognize an ongoing role for the likes of me and my kind….maybe we don’t have to surrender our wetware completely to the gratisphere just yet…John Henry was a model-hammerin’ man…..). My favorite excerpt (pp. 131-132): “A few years ago, we began hearing extravagant tales of software that would eliminate the need for human analysts….While data mining software is a wonderful thing, a smart human still needs to interpret the patterns that are identified, decide which patterns merit validation or subsequent confirmation, and translate new recommendations for action. Other smart humans need to actually take action.”

Another key take-away for me from this book is that professional analysts—i.e., predictive model builders, who power those analytical models with structured data, deep domain expertise, and statistical algorithms—can only accomplish so much if the organizations that employ them are captive to bad business models. From page 55: “[One of the things that has] kept [American and United Airlines] from succeeding with their analytical strategies….is that their analytics support an obsolete business model. They pioneered analytics for yield management, but other airlines with lower costs can still offer lower prices (on average, if not for a particular seat). They pioneered analytics for complex optimization of routes with many different airplane types, but competitors such as Southwest save both money and complexity by using only one type of plane. They pioneered loyalty programs and promotions based on data analysis, but their customer service is so indifferent that loyalty to these airlines is difficult for frequent flyers.”

In other words, to extend the airline metaphor, the human analysts are like the navigators in the cockpit. They are totally on top of every data point surfaced through radar, instrumentation, etc. But they are essentially captive to the decisions made by the genius sitting in the pilot’s seat.

Somehow, my mind goes back to the movie “Airplane!”, when, after pilot Peter Graves was felled by food poisoning, flight attendant Julie Hagerty got on the intercom and asked the passengers: “Excuse me, there’s no cause for alarm, but does anybody back there know how to fly an airplane?”

Jim