Friday, December 21, 2007

forest-or-trees Expert Predictions for 2008

All:

Wow...quite a year....while Chris Butler and the Waitresses perform "Christmas Wrapping" on www.kexp.org and Jason behind me taps out the first draft of the play he'll also direct a couple of months from now, let me Janus-wise opine on the year gone by and the year to come....'twas fine of Jeff Kelly to spur this forest-or-trees roundup of my bread-and-butter (and has it really been three years since the Boxing Day Tsunami?):

**************************

BUSINESS INTELLIGENCE (BI)

  • BI becoming SOA’s crown jewel: The past year has seen a rash of headline-grabbing M&A deals in the BI arena, with Oracle’s acquisition of Hyperion, SAP’s deal for Business Objects, and IBM’s pending takeover of Cognos—not to mention acquisitions of smaller BI and corporate performance management (CPM) application vendors by most of those firms. It’s far too easy to misinterpret these recent events as just more of the same M&A-stoked empire-building that we’ve come to expect from large IT solution vendors. What’s driving this recent industry consolidation—which is sure to continue in 2008--is growing vendor recognition that BI is the crown jewel in any comprehensive service-oriented architecture (SOA) solution portfolio. Though Oracle and SAP (and, to a lesser degree, IBM) already had decent BI wares in their respective SOA portfolios, none of them were on any enterprise’s short list of name-brand BI solution providers—until, that is, each of them decided to grab a leading BI pure-play. SOA suites cannot be considered feature-complete unless they incorporate a comprehensive range of BI features.
  • BI evolving into tailored business analytics: CPM—sometimes called “business analytics”—is rapidly becoming a key competitive front in the BI wars. Increasingly, BI/CPM vendors are offering tailored solutions for a dizzying range of horizontal business requirements and vertical industries. Vendors’ continued profitability also hinges on their ability to provide the professional services necessary to create, customize, and support business analytics for each vertical industry’s and specific customer’s unique requirements. Without a doubt, we’ll see further verticalization of product and service offerings by CPM vendors in 2008, which will provide a necessary hedge against the inevitable creep of commoditization into such horizontal analytics segments as financial, human resources, sales and marketing, and supply chain management.
  • BI going truly real-time through complex event processing: Complex event processing (CEP) promises business agility through continuous correlation and visualization of multiple event-streams. However, CEP has heretofore been conspicuously missing from the mainstream BI arena, necessitating stovepipe CEP implementations that are only loosely integrated with enterprises’ existing visualization, reporting, dashboarding, information modeling, metadata, and other BI infrastructure components. That will change big-time in 2008, as most leading BI vendors start to partner with CEP pure-plays, or acquire them outright, in order to strengthen their support for real-time event-driven applications. We expect to see SAP/Business Objects, IBM/Cognos, Oracle/Hyperion, SAS Institute, Microsoft, Information Builders, and MicroStrategy venture into the CEP arena in the coming year. Likewise, it’s very likely that the newly independent Teradata, which has taken the lead in real-time data warehousing (DW), will snatch up a CEP vendor to build out its real-time BI portfolio. (For a toy illustration of the windowed event correlation at the heart of CEP, see the sketch following this list.)
  • BI bundling with DW appliances: Appliances have even begun to take up permanent residence at the heart of the enterprise data center: in the DW and BI infrastructures. Increasingly, vendors are focusing on integrating, packaging, and pricing their DW/BI products as pre-configured, modular appliances for quick deployment. These appliances consist of processing, storage, and software components that have been prepackaged, preconfigured, and pre-optimized for core DW/BI functions such as multidimensional online analytical processing (OLAP) queries, bulk data loading, and online archiving. The past year saw a growing range of DW vendors—including such DBMS powerhouses as IBM, Oracle, and Microsoft—reorient their DW/BI go-to-market strategies around the appliance model. In turn, leading BI vendors such as Business Objects and Cognos made a big push into the appliance arena. In 2008 and beyond, more and more DW vendors will pre-integrate BI solutions—their own and/or those of their partners—into their appliances. Increasingly, DW/BI appliances will be tailored, packaged, and priced for many market segments and deployment scenarios.
  • BI goes collaborative: Collective intelligence is an organization’s most precious asset. Traditionally, the BI industry has offered little to directly address one of the most critical components of group IQ: the collaboration environment. Instead, most BI applications focus on delivering targeted reports, analytics, dashboards, multidimensional visualization, and other key data to individual end users in isolation, rather than to larger business teams. In the past year, though, BI vendors have begun to roll out more collaboration features in their products—such as Microsoft with its new Office PerformancePoint Server 2007 solution--or, at the very least, have begun talking about new collaboration features to expect in the coming year. In 2008 and beyond, we expect to see the BI, collaboration, and knowledge management segments converge. Likewise, we expect to see such interactive Web 2.0 technologies as AJAX, blogs, wikis, and social networking revolutionize the BI experience. Many BI vendors now realize that decision support environments should allow users to access intelligence wherever it may reside, be it in data warehouses or in the heads of remote colleagues.
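
Since “continuous correlation of event streams” can sound like hand-waving, here’s the sketch promised in the CEP item above: a deliberately minimal bit of Python (my own toy, with invented event names and thresholds, not any vendor’s actual rule language) showing windowed event correlation, the kernel of what a CEP engine does:

from collections import deque

class SlidingWindowDetector:
    # Toy complex-event rule: flag any account that generates
    # `threshold` or more events within a `window_secs` window.
    def __init__(self, window_secs=60, threshold=3):
        self.window_secs = window_secs
        self.threshold = threshold
        self.history = {}  # account_id -> deque of event timestamps

    def on_event(self, account_id, timestamp):
        q = self.history.setdefault(account_id, deque())
        q.append(timestamp)
        while q and timestamp - q[0] > self.window_secs:
            q.popleft()  # expire events that slid out of the window
        if len(q) >= self.threshold:
            return "ALERT: %d events for %s within %ds" % (
                len(q), account_id, self.window_secs)
        return None

# Hypothetical stream: (account, seconds since start)
detector = SlidingWindowDetector()
for account, ts in [("acct-7", 1), ("acct-7", 20), ("acct-9", 25), ("acct-7", 45)]:
    alert = detector.on_event(account, ts)
    if alert:
        print(alert)  # fires on the third acct-7 event inside 60 seconds

A real CEP engine generalizes this pattern into declarative rules, many concurrent windows, and joins across heterogeneous streams, all wired into visualization and alerting; that is precisely the plumbing the BI vendors named above will need to build or buy.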

MASTER DATA MANAGEMENT (MDM)

  • MDM vendors consolidate: Recent M&A activity—such as SAP/Business Objects, Oracle/Hyperion, IBM/Cognos, and Microsoft/Stratature--can be viewed as driven by vendors’ need to assemble more comprehensive solution portfolios to manage diverse master data sets and feed them into enterprise BI/CPM environments. In 2008, vendors will--through strategic acquisitions, partnerships, and internal development--assemble MDM solution portfolios that encompass best-of-breed solution elements in data integration (DI), data quality (DQ), DW, DBMSs, cross-catalog hierarchy management, pre-built domain models, data modeling and mapping, and data governance (DG). We expect to see such leading MDM pure-plays as Siperian, Initiate Systems, and Kalido be acquired by larger vendors looking to build up their MDM portfolios.
  • MDM vendors converge their platforms: Some of the recent industry consolidations have brought together former rivals who each have their own MDM solution portfolios that will need to be converged in 2008 and beyond onto a common platform. IBM, which acquired two MDM vendors in the past few years, has already pre-announced a converged new MDM solution that will be generally available in the first quarter of 2008. Oracle, which acquired Hyperion’s financial data-hub MDM solution in 2007, is likely to converge that offering with its pre-existing customer data integration (CDI) and product information management (PIM) MDM offerings on the Fusion Middleware platform in 2008. Likewise, we expect to see SAP begin to bring some important MDM-enabling Business Objects technology—especially strong data profiling and cleansing—into its established NetWeaver MDM offering.
  • MDM vendors differentiate through prepackaged solution accelerators: MDM projects are often complex, costly, and time-consuming. Recognizing this barrier to user adoption, vendors have increasingly sought to lower total cost of ownership through prepackaged MDM solution accelerators, sometimes known as domain models or templates—for CDI, PIM, and various vertical application domains. The leading MDM vendors—such as IBM and Oracle—rolled out and enhanced solution accelerators in 2007, and this trend will extend to all vendors, large and small, in 2008 and beyond. These solution accelerators consist of packaged master data definitions, schemas, models and objects, plus data governance infrastructure necessary to tailor an MDM or DW environment for a particular horizontal application (such as CDI, PIM, and financial consolidation) or vertical, industry-specific deployment (such as retailing, financial services, consumer packaged goods or healthcare).
  • MDM becoming more of a services than a product market: Even with prepackaged solution accelerators, there is no such thing as a truly shrink-wrapped, plug-and-play MDM solution. (The toy match-and-merge sketch following this list hints at why.) In practical terms, MDM often refers to a target enterprise architecture that can be complex, costly and difficult to implement and administer. Consequently, enterprises must engage professional services organizations to guide their MDM projects and operations every step of the way. In 2007, MDM solution vendors continued to build out their own consulting and systems integration (SI) organizations and to cultivate their partnerships with leading professional services firms that have expertise in CDI, PIM, and other key MDM applications. We expect to see this trend accelerate in 2008 and beyond, as professional services leads the way in most vendors’ MDM go-to-market strategies, and as these domain experts become the primary content providers to vendors’ precious MDM solution accelerators.
  • MDM deployments become more decentralized and virtualized: Traditionally, MDM has been deployed in a centralized fashion around the enterprise DW. However, 2007 saw more and more vendors stress more decentralized, virtualized deployment models for MDM, such as IBM with its focus on “multi-style, multi-form, multidomain” deployments. We expect this trend to accelerate in 2008, as vendors respond to users’ demands for life-cycle management of master data sets across federated environments. Increasingly, users are requiring flexible MDM environments that allow them to deploy master data sets in centralized or federated topologies, while retaining unified, SOA-based DG across their enterprise service bus.
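
As argued above, plug-and-play MDM is a myth, and a glance at the heart of any MDM hub, its match-and-merge (“golden record”) logic, hints at why. Here is a drastically simplified, hypothetical sketch in Python (invented records, records pre-matched on email, latest-non-null survivorship):

# Two source systems disagree about the same customer. A real MDM hub
# applies configurable match rules (here the records are already matched
# on email) and survivorship rules (here: latest non-null value wins).
records = [
    {"source": "CRM", "updated": "2007-11-01",
     "email": "j.smith@example.com", "name": "J. Smith", "phone": None},
    {"source": "Billing", "updated": "2007-12-15",
     "email": "j.smith@example.com", "name": "John Smith", "phone": "555-0100"},
]

def golden_record(matched):
    # Merge field by field; more recently updated non-null values win.
    merged = {}
    for rec in sorted(matched, key=lambda r: r["updated"]):
        for field, value in rec.items():
            if field not in ("source", "updated") and value is not None:
                merged[field] = value
    return merged

print(golden_record(records))
# {'email': 'j.smith@example.com', 'name': 'John Smith', 'phone': '555-0100'}

Which sources to trust, how fuzzy the matching should be, which fields survive: none of that ships in a box, which is exactly why professional services will lead vendors’ MDM go-to-market strategies.
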
**************************

Next up, new vistas in Data Warehousing. More on that in the next few posts.

Jim

Tuesday, November 20, 2007

ink-impressed tek-biz bound-bulk-pulp review “The Limits of Privacy” by Amitai Etzioni

All:

With this present post, I’ve now reduced the still-gotta-finish-this-book stack to an illustrated history of Peanuts (key take-away: this classic 50-year strip hit a new plateau when Snoopy, in the late 50s, started to stand on his hind legs, dance, verbalize, and serve as wild-card for any flight of Schulz’ imagination); a detailed history of Fairfax County, Virginia (key take-away: the site where my wife and I work out was, before the Civil War, a large day-laborer market for freed slaves and for slaveholders seeking to rent out their chattel); and Joseph Campbell’s “Myths to Live By” (key take-away: he was a devotee of Carl Jung, and he didn’t much care for hippies). No…I won’t blog further on those titles (or will I?).

Now to the reviewed book of the day. “The Limits of Privacy” came free from the author, a political science professor at George Washington University. I met Amitai Etzioni last summer at the birthday party for privacy advocate Marc Rotenberg (director of the Electronic Privacy Information Center) in Washington DC. It was a good discussion, over potluck, followed by a mutual exchange of business cards. A week or so later, Etzioni’s book arrived in the postal mail with a note: “Please accept this publication with compliments of: The Institute for Communitarian Policy Studies. If you wish to consult with Amitai Etzioni, he may be reached at (202) 994-8190.” I pass this along to you, my readers, in case you need such services. Sending me the book itself, transmitted from his gratisphere to mine, was the best service that Etzioni could have rendered.

“The Limits of Privacy” is a brilliant dissection of the current legal, regulatory, policy, economic, and cultural issues surrounding privacy protection. It was published in 1999 but still feels fresh, thanks in large part to the solid analysis at its core, laying out a clear and practical set of principles for balancing privacy concerns against “common good” imperatives such as public safety and health. The book has been sitting on my dresser for several months now, inviting me to pick it up, glance at this or that chapter again, and put it down for future browsing. That’s not a limitation of the tome—it’s a strength—always something there to tickle the cerebral cortex with new insights. I’m surprised it’s taken me this long to blog on it.

Though written for public policy wonks, it is very much a tek-biz book. First off, many of the current privacy controversies that Etzioni discusses revolve around applications of information technology: e.g., strong encryption, biometric identifiers, national ID cards, medical records disclosure procedures, web-based publication of convicted sex offender identities/addresses, etc. Second, he identifies large corporations—or as he calls them, “Big Bucks”—as the primary violators of privacy, and is less concerned about government agencies, which he refers to by the Orwellian “Big Brother.” Consequently, he’s far more concerned about overzealous big-biz interests trampling on people’s privacy—remember, this book was published two years before 9/11, during the first dotcom bubble…so keep that historical context in mind…though his concerns are still valid.

However, Etzioni is no privacy-absolutist libertarian or “cypherpunk” (yes, Kim Cameron, this is a real word/movement…or, at least, was back then in the late great 90s….see p. 97 of Etzioni’s book). In fact, Etzioni takes pains to distance himself from the privacy absolutist camp. He’s a different species entirely: a “communitarian,” who believes that the privacy zealots, though well-meaning, have gotten out of hand and need to be counterbalanced by a co-equal emphasis on compelling “common good” concerns that may, in specific circumstances, justify limits on privacy (hence the title of the book). Per pp. 195-196:

  • Etzioni: “Contemporary champions of privacy often still employ arguments that treat privacy as either an unbounded or privileged good….The negative consequences, however, of treating privacy and other individual rights as sacrosanct have been largely ignored by those who draw on legal conceptions fashioned in earlier ages….[In] American society after 1960….[t]he realms of rights, private choice, self-interest, and entitlement were expanded and extended, but corollary social responsibilities and commitments to the common good were neglected, with negative consequences such as the deterioration of public safety and public health. The new sociohistorical context, as we see it, calls for greater dedication to the common good and less expansive privileging to the individual rights.”

Of course, everybody has a different reading on the “sociohistorical context,” and many might say it’s not a “new” context at all, but just the same old tug-of-war between true blue defenders of civil liberties and those who would attempt to limit those precious freedoms under the usual authoritarian pretexts such as the need for “law and order,” a “return to family values,” the “war on terror,” and so forth. Hence, the never-ending stalemate between “libertarians” and “communitarians” (or whatever labels you wish to assign to the polar camps in this culture war).

Lest you think that Etzioni is a far-right-wing absolutist, I urge you to read the book and see how well he balances and nuances his policy analysis and recommendations in his treatment of various privacy controversies. “The Limits of Privacy” offers a practical, sensible, context-sensitive mechanism for identifying necessary privacy limits, consisting of the following criteria (pp. 12-13):

  • FIRST CRITERION (applies only where a threat to society crosses a threshold identified in the criterion): “[T]ake steps to limit privacy only if [society] faces a well-documented and macroscopic threat to the common good, not merely a hypothetical danger.”
  • SECOND CRITERION (applies only if the threat specified by the first criterion is identified): “[C]ounter [that threat to society] without first resorting to measures that might restrict privacy.”
  • THIRD CRITERION (applies only if the non-privacy-diminishing measures specified by the second criterion cannot be identified): “Make [privacy-diminishing measures] as minimally intrusive as possible.”
  • FOURTH CRITERION (applies only if the privacy-diminishing measures specified by the third criterion are being evaluated for possible implementation): “[M]easures that treat undesirable side effects of needed privacy-diminishing measures are to be preferred over those that ignore these effects.”

Etzioni does more than just lay out these criteria: he demonstrates in privacy-relevant case after case how they can be used to clarify the pros and cons of various policy alternatives. Essentially, he’s doing an economics-like trade-off analysis of privacy vs. other important “goods,” implicitly recognizing (some of the forthcoming air quotes inserted by yours truly) that each privacy-protection measure may have a countervailing “opportunity cost” in forgone “common goods,” and may introduce “externalities” that nullify its advantages in whole or part. This approach reflects the fact that privacy protection is part of a never-ending balancing exercise that must consider larger cultural, economic, and geopolitical issues.
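
In fact, the four criteria read almost like a decision procedure. Just to make the cascade explicit, here is a toy rendering in Python (my own paraphrase of pp. 12-13, not Etzioni’s formulation):

def evaluate_privacy_measure(threat_is_documented, nonprivacy_remedy_exists,
                             is_minimally_intrusive, treats_side_effects):
    # Each criterion applies only if the previous gate was passed.
    if not threat_is_documented:
        return "First criterion fails: no well-documented, macroscopic threat."
    if nonprivacy_remedy_exists:
        return "Second criterion: counter the threat without touching privacy."
    if not is_minimally_intrusive:
        return "Third criterion: redesign the measure to be minimally intrusive."
    if not treats_side_effects:
        return "Fourth criterion: prefer a variant that treats its side effects."
    return "Measure clears all four criteria."

# E.g., a documented threat with no privacy-preserving alternative,
# implemented intrusively-but-minimally, with side effects treated:
print(evaluate_privacy_measure(True, False, True, True))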

Nevertheless, I’m still a bit troubled by the slipperiness of Etzioni’s overarching “sociohistorical context” framework for identifying the proper balancing point for privacy rights. From what I can see, it could be used to justify a radical rewrite of privacy laws/regs toward either a fascistic or anarchistic extreme, on the grounds that such a sudden, disruptive measure is necessary to rebalance the sociohistorical equation. Per the Conclusion on p. 215:

  • Etzioni: “Above all, a communitarian approach to privacy avoids the failings of static conceptions by taking into account sociohistorical changes. For example, it recognizes that [the] more privacy is granted from informal social controls in a given period, the more state controls will be necessary in following years to sustain the same level of social order.”

Etzioni follows this statement with an assurance that this approach is necessary for society to avoid either of the ideological extremes, where privacy is concerned. However, his closing statement (to the entire book) seems prone to misinterpretation, stressing as it does the need for “endeavors to ensure that society’s elementary needs for public health and public safety are not neglected.” Many dictators reinforce the legitimacy of their regimes by citing the need to defend the public’s safety and health (physical, economic, moral) from enemies, foreign and domestic—hence, the “law and order” and “spiritual cleansing” justifications for tyranny.

So, careful there, Amitai. Words are swords. Rhetorical edges can double back on those who wield them. Irony, red as passionate prose, sharp as stainless steel.

Jim

Monday, November 19, 2007

ink-impressed tek-biz bound-bulk-pulp review “Revolt in the Boardroom” by Alan Murray

All:

Winnowing down the stack of recent tomes rendered to me by others via the gratispherical outreach arms of their respective career-o-spheres. Free is neat.

Murray’s “Revolt in the Boardroom” is more of a biz-biz than a tek-biz book (but that’s fine…I’m feeling reasonably bizzy at this particular moment). It is a well-researched and compelling discussion of the changing governance practices of corporations in the post-millennium, post-9/11, post-Enron, post-SarbOx era. Written by an assistant managing editor at the Wall Street Journal, the book focuses on the steady weakening of the CEO’s clout and strengthening of corporate boards of directors, who are now far less willing to kowtow (or so Murray argues) to arrogant, authoritarian, corrupt chief executives. The book primarily focuses on US-based corporations, though it hints at being generalizable to a global scale (implicitly, there’s this assumption that America sets the lead for the world at large…which is highly debatable).

This book was one of two that were given to me at SAS Institute’s recent Premier Business Leadership Series Conference (the other book was Davenport/Harris' “Competing on Analytics,” which I reviewed in this blog late last week), an event that was, of course, heavily focused on software for corporate performance management (CPM). To the extent that there’s any tek content in “Revolt in the Boardroom,” other than a detailed discussion of the recent C-suite travails of tek-vendor HP, it’s on page 27, wherein Murray refutes the late John Kenneth Galbraith’s contention (in 1971) that advanced technology is making the economy more supply-push by strengthening corporations’ ability to engineer demand for the output of their factories. Here’s Murray’s rebuttal to JKG, citing the role of information technology in making corporations more demand-pull:

  • Murray: “[I]n fact, a revolution in information technology helped to bring companies much closer to the marketplace, providing them faster access to information on what consumers were demanding, and giving them greater ability to adjust to those demands.”

In the broadest perspective, the book looks at the need for strong corporate governance, risk, and compliance (GRC) management—though it looks at it from a purely business-trends perspective. Murray's discussion pays no attention to how CPM software or other IT solutions can enable more effective GRC measurement and enforcement. That’s not a weakness of the book….just a matter of scoping…indeed, the very final paragraph practically screams for a GRC/CPM/analytics-focused sequel:

  • Murray: “Academics have tried to settle this debate, looking for evidence that ‘good corporate governance’—i.e., an effective check on a CEO’s power—leads to better performance for shareholders. So far, however, the evidence is mixed. In part, the problem is one of definitions. What is ‘good governance’? How do you measure ‘performance’? At the end of the day, the studies are inconclusive. The choice between the old regime and the emerging new one seems to be more a matter of faith and preference than reason or science.”
IMHO, Murray is throwing in the towel prematurely on this critical issue. He’s implying that the justification for good governance is purely intuitive and qualitative. In fact, it’s way too important to leave purely to the warm and squishies. I’d like to see a follow-on entitled “Complying on Analytics.” Davenport and Harris—opportunity for you!

Also, reading through this book, it’s not clear to me what emerging new “regime” Murray’s referring to. He doesn’t conclusively demonstrate any enduring restructuring of the institutional basis for governance of public corporations in the USA or anywhere else. All he points to are a “new CEO” (translation: a fresh batch of folks in those positions who are slightly less arrogant, more collegial, and more broadly stakeholder-focused than the bunch they’re succeeding) and a “new power elite” (translation: greater, albeit still minuscule, representation of pension funds, shareholder advisory services, social activists, hedge funds, and nongovernmental organizations on corporate boards of directors).

But this “new order” is just a matter of the latest transient swing in the corporate culture, responding to recent events in the economic, regulatory, and political arenas. This so-called “democratization” of corporate governance (activist boards!) can easily swing back to a preference for autocratic leaders (visionary CEOs!) once we get some fresh, charismatic new movers and shakers in the C-suites of this world.

We compete on a global scale. Chinese regimes, for example, are not known for C-suite transparency.

Jim

Friday, November 16, 2007

ink-impressed tek-biz bound-bulk-pulp review “Competing on Analytics” by Thomas H. Davenport and Jeanne G. Harris

All:

Winning is a “science” now, or so says the subtitle of this new book. Funny, I thought winning was an art—or, rather, a result to be sought through art, science, dumb luck, karma, magic, good genes, treachery, God’s grace, or what have you.

Regardless, winning is adaptive success, and adaptation through natural/competitive (and/or engineered) selection is what drives evolution, and there is some science (i.e., a systematic, fact-based, collaborative inquiry into basic principles, descriptive and predictive) behind our belief that evolution is how life in all its crazy heterogeneity continues to cultivate God’s green Earth, so I’ll grant them this word/concept in this context.

Actually, let me take this opportunity to spell out my core definition of “science,” and then map it into Davenport/Harris’ discussion of how analytics supports a science-like approach under which humans manage to tighten and hopefully brighten our stewardship over this planetary inheritance.

I actually addressed this matter indirectly on July 12 of this year, in this blog, under the seemingly endless (though only two-month) “Ocean Semantic” thread. Buried in an extremely long shapeless run-on paragraph near the end of that thread, and couched in the context of a gratuitously erudite observation on Kant’s metaphysics, here’s how I defined “science”: a “process of progressive societal construction of an interlinking system of empirically verifiable statements through the building and testing of interpretive frameworks via controlled observation.”

The key concept here is “controlled observation,” and, in particular, the notion of appropriate controls on (empirical) observations. Pretty much everybody agrees that the key controls on scientific investigations--in order to “build and test interpretive frameworks,” i.e., construct and confirm hypotheses—should be some combination of analytical, logical, mathematical, statistical, experimental, demonstration/replication, independent verification, peer review, and other methods, procedures, checkpoints, and so forth. Some controls are more appropriate and feasible for some branches of scientific investigation than for others (e.g., you can do controlled, laboratory, experimental verification in organic chemistry more readily than in astrophysics). Such fact-based controls are designed to drive the decision to confirm or not confirm hypotheses, or disprove, qualify, or constrain established theorems.
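
To make “controlled observation” concrete, here is a toy instance of one such statistical control, a simple permutation test, in Python (numbers invented): it asks whether an observed difference between two experimental groups is too large to be a fluke of labeling.

import random

# Invented observations from two conditions of a controlled experiment.
a = [12, 15, 11, 14, 13]
b = [16, 18, 17, 15, 19]
observed_gap = sum(b) / len(b) - sum(a) / len(a)

random.seed(7)
pooled = a + b
extreme = 0
trials = 10_000
for _ in range(trials):
    random.shuffle(pooled)
    # Under the null hypothesis, the group labels are arbitrary.
    gap = sum(pooled[5:]) / 5 - sum(pooled[:5]) / 5
    if gap >= observed_gap:
        extreme += 1

# A tiny p-value is the "control" that licenses confirming the hypothesis.
print("gap = %.1f, permutation p ~= %.4f" % (observed_gap, extreme / trials))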

Getting now to “Competing on Analytics: The New Science of Winning,” Davenport/Harris define their core concept, “analytics,” as referring to “extensive use of data, statistical and quantitative analysis, explanatory and predictive models, and fact-based management to drive decisions.” Clearly, what they’re describing is essentially an application of scientific practices to practical matters: solving business problems. That’s cool…science done in business suits is just as valid as in lab coats….and maybe more useful where it truly counts: creating sustainable value, generating wealth, and contributing to human happiness in some small way.
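
To ground that definition in the smallest possible example, here is what a “predictive model to drive decisions” can look like when stripped to the bone: an ordinary least-squares fit in bare Python, with invented numbers:

# Hypothetical data: quarterly ad spend ($K) versus sales ($K).
ad_spend = [10, 20, 30, 40, 50]
sales = [120, 150, 210, 240, 290]

n = len(ad_spend)
mean_x = sum(ad_spend) / n
mean_y = sum(sales) / n

# Closed-form simple linear regression (ordinary least squares).
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(ad_spend, sales))
         / sum((x - mean_x) ** 2 for x in ad_spend))
intercept = mean_y - slope * mean_x

# "Fact-based management": predict sales if next quarter's budget is $60K.
print("sales ~= %.1f + %.2f * ad_spend; at $60K, predicted sales: %.0f"
      % (intercept, slope, intercept + slope * 60))

Everything Davenport/Harris describe, from yield management to loyalty analytics, is this same move performed at industrial scale, with richer models and far messier data.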

The book is an excellent discussion of how enterprises can compete through smart application of statistical analysis, predictive modeling, data/text mining, simulation, business intelligence (BI), corporate performance management (CPM), online analytical processing (OLAP), data warehousing, data cleansing, expert systems, rules engines, interactive visualization, spreadsheets, and other applications and tools that once, in the prehistoric days before I entered the industry in the mid-80s, were often lumped under the heading of “decision support systems” (DSS). It’s no surprise that I received the book as a freebie for attending a recent conference sponsored by SAS Institute, which was not only a pioneering vendor in DSS starting in the mid-70s, but of course remains a powerhouse in BI, CPM, data mining, statistical analysis, predictive modeling, visualization, and many of the other DSS-ish technologies I just enumerated (thanks SAS!). The book is chock full of excellent case studies of companies in many industries that have differentiated themselves, notched impressive ROI, and competed effectively through DSS-ish analytics technologies—and also by cultivating analytics-driven cultures that are spearheaded by CEOs who got analytics religion.

Analytics, analysis, and analysts truly rule…that’s for sure…I’m an analyst, so of course this resonates…and this book is a very handy set of guidelines for organizations that want to leverage their BI and other analytics investments into sustainable competitive advantage. For purely personal reasons, one of the things I noticed while reading this book is that Davenport/Harris twice give kudos to Peter G.W. Keen, who in the mid-70s, as an academic, helped pioneer/popularize the concept of DSS. The reason I say “personal” is because Peter G.W. Keen, in the mid-80s, as president of the short-lived MCI-funded DC-based quasi-analyst-firm International Center for Information Technologies, hired James Kobielus as a research associate…an experience that led to, among other things, my still-going stint as a contributing editor/pundit for Network World (though it actually wasn’t my first “analyst” job….that was actually an internship in the summer of 1979, between my junior and senior years in college, at an urban coalition, New Detroit Inc., as a policy analyst, trying to help that city, near which I grew up, recover and rebuild from its sad decline…but I digress). Closing the loop on Keen, when I first picked up Davenport/Harris’ book (but before opening the cover), I thought to myself: “hmmm…’Competing on Analytics’….somehow, it reminds me of the title of Keen’s ‘Competing in Time’ book, which was published during my ICIT stint….hmmm….”

Anyway, one of many things I like about Davenport/Harris’ book is their nuanced discussion of the proper roles of analytics vs. intuition in business decisions, and of the roles of automated analytic tools vs. human analysts (on the latter….whew, I thought….at least they recognize an ongoing role for the likes of me and my kind….maybe we don’t have to surrender our wetware completely to the gratisphere just yet…John Henry was a model-hammerin’ man…..). My favorite excerpt (pp. 131-132): “A few years ago, we began hearing extravagant tales of software that would eliminate the need for human analysts….While data mining software is a wonderful thing, a smart human still needs to interpret the patterns that are identified, decide which patterns merit validation or subsequent confirmation, and translate new recommendations for action. Other smart humans need to actually take action.”

Another key take-away for me from this book is that professional analysts—i.e., predictive model builders, who power those analytical engines with structured data, deep domain expertise, and statistical algorithms—can only accomplish so much if the organizations that employ them are captive to bad business models. From page 55: “[One of the things that has] kept [American and United Airlines] from succeeding with their analytical strategies….is that their analytics support an obsolete business model. They pioneered analytics for yield management, but other airlines with lower costs can still offer lower prices (on average, if not for a particular seat). They pioneered analytics for complex optimization of routes with many different airplane types, but competitors such as Southwest save both money and complexity by using only one type of plane. They pioneered loyalty programs and promotions based on data analysis, but their customer service is so indifferent that loyalty to these airlines is difficult for frequent flyers.”

In other words, to extend the airline metaphor, the human analysts are like the navigators in the cockpit. They are totally on top of every data point surfaced through radar, instrumentation, etc. But they are essentially captive to the decisions made by the genius sitting in the pilot’s seat.

Somehow, my mind goes back to the movie “Airplane,” when, after pilot Peter Graves was felled by food poisoning, flight attendant Julie Hagerty got on the intercom and asked the passengers: “Excuse me, there’s no cause for alarm, but does anybody back there know how to fly an airplane?”

Jim

Thursday, November 15, 2007

ink-impressed tek-biz bound-bulk-pulp review “Wikinomics” by Don Tapscott and Anthony D. Williams

All:


“Wiki” is a fun and funny-sounding new coinage (for one thing, it’s got “icky” inside; for another, it’s a Hawaiian-ish hula-skirted equivalent of the English “quick”; for a third, it suggests the classic Florida kitsch of mermaid water park “Weeki Wachee”; for a fourth, it conforms to the klassic borscht belt diktum that hard “k” sounds kill ‘em every time up in Poughkeepsie, Schenectady, and Skaneateles, and out in Kankakee, Kokomo, and Kalamazoo). As a prefix, “wiki” combines nicely with many older words (such as the “nomics” Greek-derived rump end of “economics,” which hasn’t recovered from the semantic abuse it has endured ever since it got lashed into the idiotic ideology-driven ersatz-scientific term “Reaganomics” more than a quarter-century ago).

Wiki rocks and rules these days. So I totally understand why Tapscott and Williams worked it into the title of their book (note that for the rest of this post, I’ll refer to Tapscott alone as the prime author, since he’s obviously driving the show on this). Though I prefer the term “mass collaboration” in their subtitle, because it more accurately describes the concept they put forth. And “mass collaboration” sounds less trendy (the term “wiki” feels so 2007, just like “groovy” is so very 1967, hence so very likely, in just a year or two, to be an embarrassing reminder of our cultural weakness (or is it a strength?) for gratuitous ad-hoc linguistic invention, a weakness I also share, as will be demonstrated once again in just a sec).

That said, “Wikinomics” is a highly readable and research-driven book, and I recommend it to anyone who wants a good overview of new frontiers in the virtualization of the world economy. In the book, Tapscott does a fine job both of laying out the core organizing principles of this new competitive environment (“openness, peering, sharing, and acting globally”) and explaining how these principles are expressed in several ongoing developments (wikis, blogosphere, social networking, open source software, online idea marketplaces, customer-driven product hacks, virtual scientific communities, externally extensible open platforms, collaborative B2B value chains, hypermatrixed enterprise teaming environments). Though I find some of Tapscott’s coinages awkward or semi-opaque (e.g., “ideagoras,” “prosumers,” “new Alexandrians,” “platforms for participation”), he is at least trying to give us a new vocabulary that fits these emerging phenomena. Of course, “wiki” is opaque in its own way, and “blogosphere” still sounds strange, but they’ve achieved currency, so maybe it’s a matter of the poetics of the coinages somehow fitting the cultural moment.

Though I enjoyed “Wikinomics,” it leaves me slightly queasy, because it glosses over one of the most important issues in this new economic order: how difficult it’s becoming to make money in a world where everybody is giving everything away for free. For IT industry analysts/pundits/authors such as myself, this hits home in the most visceral way. Any of us who has a blog faces the same challenge every time we post: deciding which thoughts should be published for free to all comers, and which should be reserved for paying customers. Or, if we choose to publish everything we learn/think, we must decide how we can leverage our visibility, reputation, analytical chops, etc. into paying gigs (e.g., consulting, speaking, etc.). Or, if that paying gig (i.e., an actual job-job) is secure, we must decide how far to go with the blogging, podcasting, etc., so as to complement and not compromise that kritical kash konnection (anybody who has premium licensed access to my Data Management module at Current Analysis will notice that I go very deep, and very prosaic, and quite prolific on all things BI, CPM, DW, DI, DQ, MDM, GRC, etc….and there’s only a tiny content overlap with what I put in this blog, or voice on the Dana Gardner et al. podcasts….that’s by design….it’s the same guy doing it all, but presenting different sides of Jim Kobielus…the blog is my outlet for coloring outside the lines…and occasionally for personal junk like the poems….and gratuitous wordplay of my own devising). Triki-triki, dis nu wiki-wiki.

These general thoughts occurred to me while seeing Tapscott present on this topic in the keynote at the recent Business Objects conference in Orlando (in which I participated in my core job-job role as an analyst for Current Analysis). That was where I, like all the other attendees, received my own komplimentary kopy of “Wikinomics” (and a great laptop-ready tote bag—thanks Business Objects!). Tapscott did a fine job discussing the topic, and illustrating it with engaging slides and videos, talking about the wonderful new economy that is fueled by everybody giving it all away for no charge. Of course, Tapscott mentioned the Radiohead Gambit (which I saw mentioned yet again in the Wall Street Journal this morning—that paper says that around 60 percent of downloaders of “In Rainbows” are paying squat—Messrs. Yorke et al. dispute that number, but don’t reveal the actual figure). And Tapscott trumpeted his own “Wikinomics” book as another example of this trend, in the literary sphere, stating that the book is a wiki-based online “peer production” work in progress, which anybody can edit through www.wikinomics.com.

All well and good, but I noticed a few things. For starters, the ink-impressed bound-bulk-pulp version of the work has a cover price (US $25.95, Canada $32.50); named co-authors (Tapscott and Williams), who I suspect are pocketing royalties; and a publisher (Portfolio/Penguin Group) that underwrote the project and, no doubt, is accruing a tidy profit from it. For another, Tapscott was standing there on stage speaking to the assembled at a conference sponsored by a major software company, and I suspect he wasn’t wagging his tongue for free. For a third, I have no doubt that Business Objects was also paying for the “free to attendee” books, which, like the tote bag and delicious food (thanks again, Business Objects!), contributed to a splendid time for all. Finally, cracking open the book, I can’t help but be impressed by Tapscott’s discussion of the ample speaking engagements, sponsored research projects, and high-visibility consulting gigs that keep him and his organization gainfully employed. Cool….he’s doing well…no problem with any of that.

But something he said from the stage at the Business Objects show gave me pause, and he went into greater detail on it at the start of his book. He mentioned, as a prime example of “wikinomics” invading the “old economy,” a Canadian gold-mining firm (Goldcorp) that, a few years back, decided to publish, for free through its website, all of the geologic data pertaining to its mine in northern Ontario. The reason Goldcorp (spurred by CEO Rob McEwen) took this radical step was that its own staff geologists had increasingly come up short in their attempts to find new gold deposits on the property. So, in March 2000, McEwen made the controversial move of launching the “Goldcorp Challenge,” under which the firm offered $575,000 in prize money to anybody anywhere who could, by examining the now-free, now-public geologic data, tell where the gold most likely lay on its property.

As Tapscott tells it, “News of the contest spread quickly around the Internet, as more than one thousand virtual prospectors from fifty countries got busy crunching data.” And he quickly cuts to the chase: “The contestants had identified 110 targets on the Red Lake property, 50 percent of which had not been previously identified by the company. Over 80 percent of the new targets yielded substantial quantities of gold. In fact, since the challenge was initiated an astounding eight million ounces of gold have been found. McEwen estimates the collaborative process shaved two to three years off their exploration time. Today Goldcorp is reaping the fruits of its open source approach to exploration. Not only did the contest yield copious quantities of gold, it catapulted his underperforming $100 million company into a $9 billion juggernaut while transforming a backward mining site in Northern Ontario into one of the most innovative and profitable properties in the industry. Needless to say McEwen is one happy camper. As are his shareholders. One hundred dollars invested in the company in 1993 is worth over $3,000 today.”

Cool—essentially and effectively, the contest is based on the premise of “you mine our data, so we can better mine our mine.” But, thumbing through that section of the book, I see no further mention of the prize money, or the criteria for awarding it, or the ultimate winner(s), or whether it was split among multiple contestants, or whether any of them were also rewarded with now-valuable shares of Goldcorp stock, or whether any of them were later hired by the firm to be full-time staff geologists. I have no doubt that Goldcorp made a handsome payout to the winner(s) of the contest, but it would have closed the loop—in Tapscott’s account—if the name(s) of these pivotal gold-data-miners had been mentioned.

This thought, as I sat there listening to Tapscott, led to another. This Goldcorp Challenge strikes me as a clever approach for extracting free geologic consulting services from many people, with only one (or a few? how many?) of the consultants seeing a payday, based on results delivered (literal paydirt identified) rather than effort expended. OK—the unnamed (in Tapscott’s book, at least) contestants seemingly knew the rules of the game (literally), so it was all on the up and up (apparently). So how is that a problem?

What struck me was the ironic parallel with Tapscott’s “Wikinomics” book, and with the whole “wikinomics” economic environment that it describes. Flip the book open to page 4, and look at the facing “Subtitles” page, and here’s what you’ll see.

On page 4, “[W]ith ‘Wikinomics,’ we’re making a modest attempt to reinvent the concept of a book. You’ll notice that the final chapter, The Wikinomics Playbook, has only fifteen words: ‘Join us in peer producing the definitive guide to twenty-first-century strategy on www.wikinomics.com.’ It is our hope that this book will transcend its physical form to become a living, real-time, collaborative document, cocreated by leading thinkers.”

On the facing page, this: “Books have a title page. This is our subtitle page. In what we believe to be a first, we’re listing a few of our favorite suggestions for subtitles gleaned from a public online discussion held the week of June 2, 2006. We received more than one hundred great suggestions in the first forty-eight hours. To our collaborators—you know who you are—we extend our most sincere thanks.”

“You know who you are”?!?!?! Why not list the actual names (can’t be more than 100, after all) right here, in the printed book (it has 324 pages total)? More to the point, why not give a flat fee or cut of the royalties to whoever (if anybody) won that challenge (i.e., whoever suggested the final “How Mass Collaboration Changes Everything”)? And, while we’re on the topic, why not publish revised editions of the printed for-a-price book that incorporate revisions submitted by others through www.wikinomics.com, and cut those people in on a share of the royalties (while giving due credit)? Sharing that gold would be a nice gesture, wouldn’t it?

All of which brings me around to the core point of this post, which is the intensifying collision between this newfangled virtual “wikinomics” world where we give it all away for free, and the oldfangled real day-job world where we receive the legal tender necessary to pay the bills. Clearly, most profits from our participation in the blogosphere are non-monetary, so it falls squarely into the “I’m marketing the brand of me” realm of “wikinomics.” I’ve been toying with various terms to describe this new phenomenon that don’t tie it to blogs or wikis or any other particular new “Web 2.0” technology. Months ago, Dana laughed when I suggested “probonosphere,” but now that strikes me as sounding like an obscure primate species. “Freebysphere” is cute, but sounds like a hot new Christmas-gift toy that we’ll discard in early January.

Right now, I’m leaning in the direction of “gratisphere” (though the linguistic purists will be irked by my combining a Latin word, “gratis” (i.e., free) with a Greek word “sphere”). And, as its opposite, I’m thinking of committing an even more heinous linguistic offense: “career-o-sphere” (where we earn our daily bread). In other words, more and more of us are contributing for free on the gratisphere, and attempting to leverage those efforts to cash in via the career-o-sphere. Certainly, I’m doing that, as are most self-respecting analysts, since we all realize we must regularly give away free samples of our brain power (e.g., blogs, podcasts, being quoted by reporters, jotting off uncompensated articles for online pubs, accepting unpaid invitations to speak at industry conferences) in order to remind the world that we’re here and that we’ve “got the goods” (a prime/premium feed of which they can access through separate channels).

Increasingly, IT publishers/editors are leveraging the gratisphere to their economic advantage (hey! free content from a leading analyst), which is no surprise (but which complicates the analyst/author/writer’s career-o-sphere equation, especially if, like Jim Kobielus, you’ve been paid for most of your published works for most of your career). In the pundit’s life, one must constantly ponder whether each new freebie-article request is a smart promotional move, or whether it delivers you more deeply into the chump-o-sphere. It’s a tricky balance, but more of us are working it out. As Pete Townshend once memorably put it, “This is no social crisis—just another tricky day for you.”

Getting back to Tapscott’s book, check out the quote he excerpts on pages 206-207 from Om Malik (“a well-read blogger and founder of GigaOmniMedia") and then Tapscott’s own immediate, brief, defensive, and inadequate retort:

  • Malik: “I wondered out loud if this culture of participation was seemingly help[ing] build businesses on our collective backs. So if we tag, bookmark, or share, and help del.icio.us or Technorati or Yahoo become better commercial entities, aren’t we seemingly commoditizing our most valuable asset—time. We become the outsourced workforce, the collective, though it is still unclear what is the pay-off. While we may (or may not) gain something from the collective efforts, the odds are whatever ‘the collective efforts’ are, they are going to boost the economic value of those entities. Will we share in their upside? Not likely!”
  • Tapscott: “Calling it exploitation goes too far.”

Wait just a second. No…it’s not going too far…if people’s contributions are delivering real value that somebody/somewhere across the virtual value chain is cashing in on, then the issue of equitable reward distribution is a legitimate topic for discussion…and if some are profiting immensely from (named) others’ efforts without cutting those others a piece of the pie, how is that not, on some level, akin to “exploitation”? So, in terms of distributional equity, in the abstract, the career-o-sphere should, on some level, acknowledge and compensate the gratisphere for its contributions. But, of course, that’s a tricky-as-hell proposition to work out in the real world, given the complex shifting membrane between these cyber-celestial spheres.

Many of us “subsidize” our gratispherical activities from the money (salaries, savings, etc.), time (evenings, weekends, vacations, coffee breaks, leisure, etc.), and other resources (professional connections, deep domain expertise) we’ve accrued from our career-o-spherical activities. We’re exploiting ourselves.

Leave it at that.

Jim

Tuesday, October 02, 2007

herenow Complex Event Processing VII

All:

Further/final continuation (concatenate this blog thread in your head from I through this VII, if you're so inclined):

This column—like my CIR, an “event” in its own right--prompted considerable feedback, some positive, some negative, from the CEP/ESP industry, and the various vendors mentioned. Every one of the CEP/ESP people who responded asked me how “familiar” I am with their space--and each of them tried to quote me chapter/verse from the EPTS reference model--which I’d already perused so many times, long ago, that it was starting to appear in my daydreams. I’ve taken additional CEP/ESP vendor briefings since then, and am planning to write some up when they go public (soon), and possibly to write up an Advisory Report on this space. And that’s the life of any industry analyst. Filtering a passing parade of events, taking flak, and fending off flacks, like any reporter. You stick your neck out there, and you have to have a tough hide. Cuz it's tough to hide, when you're as published/public as I am.

Which brings me to a thought. People often ask me how I manage this all in my head and heart. As I’ve said before, it’s a bit like being a reporter on a beat—alerted and tuned in and ready to respond to “breaking news.”

But it’s also a bit like meditation—i.e., sitting in rapt attention in contemplation of a dynamic multifaceted multidimensional space, like a kaleidoscopic Buddhist mandala or prayer wheel or gem, focusing on nothing in particular, but aware of literally everything on some level. The following excerpt from one of my recent extracurricular readings (Damien Keown, “Buddhism: A Very Short Introduction,” Oxford University Press, 1996) just jumped out at me as absolutely bang-on brilliant. It’s all a matter of recognizing that one’s own reactions to the day’s events are themselves events that, like the external events themselves, are impermanent, a passing parade, a ghostlike abstraction that shall soon pass, hence underlining the need for the engaged-but-detached observer/analyst to stay supple and adaptive and to roll with the flow of the all-changing:

*************************

“To the kind of techniques already described, which in Buddhism go by the generic name of ‘calming meditation’ (samatha), the Buddha added a new one called ‘insight meditation’ (vipassana). The goal of this was not peace and tranquility but the generation of penetrating and critical insight (prajna). Whereas in calming meditation intellectual activity subsides at an early stage (on reaching the second jhana), in insight meditation the object of the exercise is to bring the critical faculties fully into play in a detailed reflexive analysis of the meditators’ own state of mind. In practice, the two techniques of calming and insight are normally used back-to-back within the same session: calming may be used first to concentrate the mind and then insight to probe and analyze. It is impossible to practice insight meditation without having reached at least the level of calm of the first jhana.

“In insight meditation, the meditator examines every aspect of his subjective experience, breaking this down into four categories: the body and its physical sensations; feelings; mood; and mental patterns and thoughts. A typical session might proceed by extending awareness of the rise and fall of the breath to the rest of the body. Every minor sensation would be noted, such as twinges, aches, itches, and impulses to move and scratch. The meditator does not respond to these impulses since the purpose of the exercise is to note with bare attention how bodily sensations arise and subside without reacting to them in the normal semi-automatic way. By learning to observe without becoming involved, the pattern of stimulus-response which underlies much human behavior can be broken. Little by little the realization dawns that one is free to choose how to react in all situations regardless of which buttons are pushed. The grip of long-standing habits and compulsions is weakened and replaced with a new sense of freedom. The analysis is gradually extended to the whole body, the intellect being wielded like a surgeon’s scalpel to dissect the various bodily parts and functions. From this the awareness arises that the body is nothing more than a temporary assemblage of bones, nerves, and tissues, certainly not a worthy object to become infatuated with or excessively attached to.

“Next, attention is directed to whatever feelings arise. Pleasant and unpleasant feelings are noted as they arise and pass away. This sharpens the perception of impermanence and gives rise to the knowledge that even those things which seem most intimate to us—such as our emotions—are transient states which come and go. Next, the subject’s current mood and constant fluctuations in its overall quality and tone are observed, and finally the stream of thoughts which passes through the mind. The meditator must resist the temptation to lose himself in the daydreams and fantasies which inevitably arise. Instead, he simply observes with detachment as the thoughts and images follow one another, regarding them like clouds passing across a clear blue sky, or bubbles floating to the top of a glass. From this detached observation, it gradually becomes clear that even one’s conscious mind is but a process like everything else.”

*************************

So, then, consciousness is just complex event processing, event stream processing, in an event-driven arc of creation.

As an IT industry analyst, you must do a very focused, sustained form of CEP/ESP, meditating headlong at full tilt, constantly, in bare non-metaphorical prose, deeply engaged and fully exposed, acutely aware of and responding to what others are saying, in a restless, dynamic industry, under deadline.

Not in a sedentary lotus position, or with a fixed object/objective, or in a stable social order, or particularly calming, or profoundly insightful on any given day, or any given industry event.

And that's where the Buddha parallel ends.

Jim

herenow Complex Event Processing VI

All:

Further continuation:

Then, the following month, I leveraged this analysis and some other CEP/ESP-related CIRs I wrote up, plus other research material that hadn’t found its way into any Current Analysis report, into the following Network World column, which was published on August 16:

**********************************

COMPLEX EVENT PROCESSING: STILL ON LAUNCHPAD

--James Kobielus

Complex event processing (CEP) has a sleek, shiny, space-age allure. CEP has been blinking on the information technology (IT) industry’s “next big thing” radar for quite a while, promising business agility through continuous correlation and visualization of multiple event-streams. Event-driven application architectures are becoming more important for modern business, as the volume of time-sensitive, real-time data that enterprise and carrier networks must process, store, and manage continues to expand.

However, CEP has yet to launch into the stratosphere of mainstream enterprise applications. For sure, the technology—also known as “event processing” or “event stream processing”--has found its niche with operational applications such as business activity monitoring (BAM), distributed process control, sensor networks, financial transaction surveillance, and integrated logistics management. But rare is the CEP application that supports the everyday needs of the average knowledge worker. CEP is still predominantly deployed as a stovepipe for specialized, albeit mission-critical, applications. And it is still primarily a vertical, industry-focused IT market segment, which is especially strong in finance, telecommunications, transportation, manufacturing, and the military.
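
To make the surveillance example concrete: the simplest possible “complex event” is just a pattern over simpler events, such as event A followed by event B on the same key within a time bound. Here is a deliberately minimal sketch in Python (invented event types and thresholds, not any vendor’s actual rule language):

# Toy composite-event rule for transaction surveillance: flag a wire
# transfer over $10K that follows a failed login on the same account
# within 60 seconds. All names and numbers are invented.
PATTERN_WINDOW = 60  # seconds

def detect(events):
    # events: time-ordered (timestamp, kind, account, amount) tuples
    last_failed_login = {}
    alerts = []
    for ts, kind, account, amount in events:
        if kind == "login_failed":
            last_failed_login[account] = ts
        elif kind == "wire_transfer" and amount > 10000:
            t0 = last_failed_login.get(account)
            if t0 is not None and ts - t0 <= PATTERN_WINDOW:
                alerts.append((account, ts))
    return alerts

stream = [
    (100, "login_failed", "acct-42", 0),
    (130, "wire_transfer", "acct-42", 25000),  # within 60 seconds: alert
    (500, "wire_transfer", "acct-42", 25000),  # stale: no alert
]
print(detect(stream))  # [('acct-42', 130)]

Rules of this sort are powerful, but today they live mostly inside those specialized stovepipes.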

None of which is to imply that the CEP market is not buzzing with activity or growing apace. The past several years have seen the entry of many promising, pure-play CEP vendors, including Agent Logic, Aleri, AptSoft, Coral8, Esper, GemStone, Kaskad, LeanWay, RiverGlass, SeeWhy, Syndera, StreamBase, and Vhayu. In recent months, Aleri, Coral8, SeeWhy, and StreamBase have issued important product enhancements that keep them in the forefront of industry innovation. In addition, established SOA, business process management (BPM), and middleware vendors such as TIBCO, Progress Software, BEA, and IBM have continued to beef up their CEP offerings through strategic acquisitions and product development.

But what’s conspicuously missing is any serious CEP uptake by business intelligence (BI) vendors, who could be instrumental in delivering real-time event streams to desktops, mobile devices, and other client environments. Consequently, most CEP tools must be implemented alongside users’ existing BI environments, providing a separate, event-optimized layer of visualization, dashboarding, modeling, repository, rules engine, resource connection, and administration tools.

What could explain this reluctance by BI vendors to test the CEP waters? To some degree, their wait-and-see posture reflects the slow uptake of real-time BI among their core enterprise customers. BI vendors have been beating the real-time drum for some time now. However, few BI users have been clamoring for the ability to refresh reports, dashboards, and scorecards continuously with straight-from-the-source event feeds. Many BI users can tolerate some latency in the delivery of key business data, and have been quite content to pull such data from intermediary data warehouses, which combine near-real-time data with historical information.

Nevertheless, CEP is an important complement to BI, and also to enterprise information integration (EII) solutions, which federate query/update operations directly to operational databases. It’s only a matter of time before most BI and EII vendors partner with CEP pure-plays, or acquire them outright, in order to strengthen their real-time event-driven functionality. We expect to see Business Objects, SAS, Cognos, Oracle/Hyperion, Microsoft, Information Builders, and MicroStrategy venture into the CEP arena in the next 1-2 years. Likewise, it’s very likely that a soon-to-be-independent Teradata, which has taken the lead in real-time data warehousing, will snatch up a CEP vendor to build out its real-time BI portfolio.

SOA middleware vendors will expand their CEP capabilities in order to offer event-driven architecture as an alternative or supplement to SOA. More vendors will CEP-enable their BPM environments’ BAM tools to support split-second response to changing business conditions. More enterprise service bus (ESB) vendors will invest in CEP to provide a user-friendly event aggregation, correlation, and visualization overlay to their publish-and-subscribe environments.
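
To make the publish-and-subscribe point concrete, here is a toy sketch, again in plain Java with invented names, of a bus whose CEP-style subscriber aggregates raw publications into a derived alert; real ESBs (JMS and the like) implement the same pattern with far more machinery:

    import java.util.*;
    import java.util.function.Consumer;

    // Toy publish-and-subscribe bus with a CEP-style overlay subscriber
    // that aggregates raw events into a derived alert once a threshold
    // is crossed. All names here are invented for illustration.
    public class PubSubCepOverlay {
        private final Map<String, List<Consumer<String>>> subscribers = new HashMap<>();

        public void subscribe(String topic, Consumer<String> handler) {
            subscribers.computeIfAbsent(topic, t -> new ArrayList<>()).add(handler);
        }

        public void publish(String topic, String payload) {
            subscribers.getOrDefault(topic, List.of()).forEach(h -> h.accept(payload));
        }

        public static void main(String[] args) {
            PubSubCepOverlay bus = new PubSubCepOverlay();
            Map<String, Integer> counts = new HashMap<>();
            // The "CEP overlay": correlate raw publications into a higher-level event.
            bus.subscribe("trades", payload -> {
                if (counts.merge("trades", 1, Integer::sum) >= 3)
                    System.out.println("Derived event: burst of trades detected");
            });
            for (int i = 0; i < 3; i++) bus.publish("trades", "trade-" + i);
        }
    }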

But the CEP market cannot achieve its full potential until the vendor community creates a consensus interoperability framework that leverages open SOA standards. One good sign is the founding, in late 2006, of the Event Processing Technical Society (EPTS), a group of vendors and other interested parties created to build awareness of CEP’s applications, clarify CEP terminology, and define a CEP reference interoperability framework.

However, the EPTS has explicitly stated that it does not intend to become a standards organization, though it may work with standards groups at a later date. Unfortunately, the group has not yet produced a public draft of any such framework, nor has it attempted to reach out to groups such as the Organization for the Advancement of Structured Information Standards (OASIS), which has developed a CEP-relevant standard: WS-Notification, which supports event-driven, notification-based SOA interaction patterns.

The CEP market will evolve swiftly over the next few years as open standards emerge, as open-source alternatives appear, and as leading SOA, ESB, BI, and EII vendors acquire the most promising pure-plays. The industry’s tipping point toward ubiquity is fast approaching. By the end of this decade, the CEP arena will look very different, and enterprises will be able to deploy multi-application, vendor-agnostic, standards-based CEP infrastructures.

********************

Jim

herenow Complex Event Processing V

All:

Further continuation:


Then, once I had a handle on this complex multifaceted “event,” I banged out the CIR in the standard Current Analysis format. Here’s an abridged final text of what I published that day (almost final, that is; I’m sure my editors corrected typos, etc., prior to publication):

**************************************

Competitive Intelligence Report

Module: Data Management

Report Title: StreamBase Launches Next Generation Complex Event Processing Platform

Analyst(s): James Kobielus

Date: June 20, 2007

Peer Reviewed By:

Target Markets: B2B Communities, End Users, Global 2000, IT Implementers, Large Enterprises, Small to Medium Enterprises, Systems Integrators, Third Party Implementers

Analytical Summary

· Current Perspective: Positive on StreamBase’s announcement and preview of the forthcoming version 5.0 of its flagship complex event processing (CEP)/event stream processing (ESP) product platform. The vendor has further differentiated its CEP/ESP solutions by announcing a wide range of enhancements in connectivity, scalability, performance, development, presentation, security, administration, deployment, and platform integration.

· Vendor Importance: High to StreamBase, because it needs to continue differentiating itself through deep functionality in the fast-emerging but overcrowded CEP/ESP market, which major SOA vendors—such as TIBCO, BEA, Progress Software, and IBM—have flagged as a high priority in their ongoing product strategies.

· Market Impact: High on the BI and DI markets, because CEP/ESP—also known as “event processing”—is a high priority in the evolution of the service-oriented architecture (SOA), enterprise service bus (ESB), business intelligence (BI), data warehousing (DW), and data integration (DI) markets; because StreamBase is a fast-growing CEP/ESP pure-play with a strong product family; and because the next generation of StreamBase’s product family, now in advanced beta, will include a broad range of enhancements that raise the functionality bar for its many competitors.

Analysis Section

Perspective

We are taking a positive stance on StreamBase’s announcement and preview of the forthcoming version 5.0 of its flagship complex event processing (CEP)/event stream processing (ESP) product platform. The vendor has further differentiated its CEP/ESP solutions by announcing a wide range of enhancements in connectivity, scalability, performance, development, presentation, security, administration, deployment, and platform integration.

StreamBase’s announcement confirms trends discussed elsewhere (see the Business Intelligence and Data Integration market assessments under Data Management). First, real-time, event-driven application architectures are becoming more important in e-business environments, a long-running trend that continues to stimulate the development and growth of the CEP/ESP market. Second, the volume of time-critical events, messages, and other data that enterprise and carrier networks must process, store, and manage continues to grow, a trend that places a high priority on robust, scalable, high-availability, high-performance CEP/ESP. Third, open-source platforms such as Eclipse continue to grow in importance, thereby spurring vendors to incorporate those platforms into their product architectures. Fourth, interactive browser-based visualization is becoming a standard feature of many data-rich applications, a trend that is causing vendors to implement a growing range of rich Internet application (RIA) standards.

StreamBase’s announcement offers the following value points for its customers. First, StreamBase announced that StreamBase 5.0 will be generally available by September 30. Second, the release will fully integrate Eclipse into the StreamBase Studio development tool. Third, the release will include an extended industry-specific Application Framework for algorithmic equities trading. Fourth, the release will support multi-event CEP pattern matching through an enhanced rules syntax. Fifth, it will add support for processing and persistence of binary large objects (BLOBs). Sixth, it will introduce integration with several third-party RIA environments. Seventh, it will include an upgrade to the Chronicle persistence framework for time-series data, supporting optimized read/write integration and bulk loading with third-party repositories from IBM, Sybase, and Vertica. Eighth, it will include new advanced security capabilities, including event-level security support, network data encryption, and secure, role-based authorization. Ninth, it will introduce an Eclipse-based Adapter Toolkit to connect StreamBase CEP applications to virtually any data source. Tenth, it will introduce many enhancements in administration and deployment.

StreamBase’s announcement was a necessity for the vendor to continue differentiating itself through deep functionality in the fast-emerging but overcrowded CEP/ESP market. In addition, StreamBase continues to scale its CEP/ESP software and optimize its architecture more tightly with its principal hardware and software partners’ platforms. Furthermore, the vendor is providing application developers with a more flexible pattern-matching rules syntax for creating optimized CEP/ESP algorithms for processing and persisting massive amounts of time-critical data. And StreamBase has integrated its CEP/ESP development tool fully with Eclipse, allowing customers to leverage and extend their commitment to that open-source platform.
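
As an aside, here is my guess at what “multi-event pattern matching” means operationally. The toy Java sketch below fires when an A event is followed by a B within a window, with no intervening C (order plus absence); it is my own illustration, not StreamBase’s actual rules syntax:

    // Sketch of multi-event pattern matching: fire when event A is followed
    // by event B within a time window, with no intervening C (order plus
    // absence). Event names and the window are invented for illustration.
    public class PatternMatcher {
        private static final long WINDOW_MILLIS = 10_000;
        private Long sawAAt = null; // non-null while the pattern is "armed"

        public void onEvent(String type, long tsMillis) {
            switch (type) {
                case "A" -> sawAAt = tsMillis;   // arm the pattern
                case "C" -> sawAAt = null;       // absence clause violated: reset
                case "B" -> {
                    if (sawAAt != null && tsMillis - sawAAt <= WINDOW_MILLIS)
                        System.out.println("Pattern matched: A then B, no C, in window");
                    sawAAt = null;
                }
            }
        }

        public static void main(String[] args) {
            PatternMatcher m = new PatternMatcher();
            m.onEvent("A", 0);
            m.onEvent("B", 4_000); // fires: within the 10-second window
        }
    }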

However, StreamBase, like all CEP/ESP vendors, has staked out a narrow market segment that has not yet been able to penetrate the enterprise mainstream in most customer verticals. Also, the vendor is vulnerable to diversification by its hardware partners—and by SOA, ESB, BI, DW, and DI vendors generally--into the CEP/ESP market. Furthermore, StreamBase has not provided a roadmap under which it will roll out industry-specific CEP/ESP application frameworks for any verticals other than financial services. And it has not introduced any interfaces to third-party business process management (BPM) environments that would enable its CEP/ESP platform to support real-time, event-driven workflows or “closed-loop” operational BI.

StreamBase’s announcement sends a signal that it intends to continue evolving its CEP/ESP platform rapidly to differentiate itself from rivals in this crowded, competitive segment. Rival CEP/ESP vendors should immediately state how they match or surpass StreamBase’s latest release in connectivity, scalability, performance, development, presentation, security, administration, deployment, and platform-integration features. Vendors of SOA, ESB, BI, DW, or DI offerings that lack CEP/ESP products should scout for strategic acquisitions from a very promising field of startups. Existing StreamBase users should begin to evaluate the current beta of StreamBase 5.0 right away and urge the vendor to make good on its promise to make this new version generally available by the end of September.

In summary, StreamBase remains a pacesetter in the emerging CEP/ESP marketplace, and its forthcoming version 5.0 release of its flagship product platform will help it to hold existing customers and attract prospective customers who are new to this market.

Positives and Concerns

Competitive Positives


· StreamBase has further differentiated its complex event processing (CEP)/event stream processing (ESP) platform by announcing a wide range of enhancements for the forthcoming version 5.0, which is due to be released by the end of the third quarter. StreamBase 5.0, which was previewed this week at a financial services industry conference, offers new functionality in CEP/ESP connectivity, scalability, performance, development, presentation, security, administration, deployment, and platform integration.


· StreamBase continues to scale its CEP/ESP software and optimize its architecture more tightly with its principal hardware and software partners’ platforms. StreamBase 5.0 enhances the vendor’s Chronicle persistence framework, which further improves the platform’s read/write integration with high-capacity, time-series event-data stores, such as IBM DB2 9, Sybase Real-time Analytics Platform, and Vertica. This release will also add support for processing and persisting binary large objects.


· StreamBase is providing CEP/ESP application developers with greater flexibility in writing custom rules for real-time, low-latency, complex event processing with sophisticated pattern matching. StreamBase 5.0 will introduce an enhanced pattern-matching syntax, which will help developers to define rules that recognize the order, presence, or absence of complex combinations of real-time events. This release will allow patterns to be identified within single event streams or across multiple parallel streams over any given period.


· StreamBase has integrated its CEP/ESP development tool fully with Eclipse. In the upcoming release, the StreamBase Studio tool is being enhanced to support graphical CEP/ESP application development involving Eclipse plug-ins, Java code, and the vendor’s StreamSQL query language. From within the tool, application developers will be able to access a full set of StreamBase-provided Eclipse plug-ins for source-code version control, task management, graphical UI development and integration, XML editors, and SQL design.


· StreamBase has integrated its CEP/ESP platform with a broad range of rich Internet application (RIA) environments, supporting more interactive, browser-based visualization of complex, real-time patterns and trends. StreamBase 5.0 will integrate with Adobe Flex, Microsoft Windows Presentation Foundation, Java Swing, and Eclipse Standard Widget Toolkit. The platform will enable bi-directional interaction in any of those RIA environments, a useful feature when CEP/ESP applications are designed to monitor and control systems and other resources in real time.

· StreamBase has expanded the connectivity, security, and administration features of its platform, providing an even more robust foundation for mission-critical, real-time event-streaming applications. StreamBase 5.0’s Eclipse-based Adapter Toolkit will allow CEP/ESP applications to connect to virtually any data source. The release will add event-level security support, network data encryption, secure directory interfaces, and role-based user authorization. And it will introduce enhanced remote administration, error management, cross-application data sharing, and other new administration and deployment features.

Competitive Concerns


· StreamBase, like all CEP/ESP vendors, has staked out a narrow market segment that has not yet been able to penetrate the enterprise mainstream in most customer verticals. Most CEP/ESP vendors are primarily addressing the real-time event-processing requirements of three verticals: financial services, telecommunications, and government/military. Though to some degree this niche can be regarded as a segment of the operational business intelligence (BI) market, few enterprises consider CEP/ESP in their BI planning exercises.


· StreamBase is vulnerable to diversification by its hardware partners—and by SOA, ESB, BI, DW, and DI vendors generally--into the CEP/ESP market. Over time, CEP/ESP will become a core, integrated feature of all computing platforms, and of the SOA, BI, DW, and DI environments that integrate tightly into those platforms. This trend can be seen in recent moves by TIBCO and BEA into the CEP/ESP segment, and by IBM’s recent announcement of plans to productize its “stream computing” technology.

· StreamBase has not provided a roadmap under which it will roll out industry-specific CEP/ESP application frameworks for any verticals other than financial services. Also, StreamBase lacks packaged interactive-visualization applications that it can use to address the real-time BI and CEP requirements of distinct horizontal and vertical markets. In this latter regard, StreamBase and other CEP/ESP vendors lag behind TIBCO, which will very likely leverage Spotfire’s “guided analytics” to target particular CEP/ESP horizontal and vertical segments.

· StreamBase has not introduced any interfaces to third-party business process management (BPM) environments that would enable its CEP/ESP platform to support real-time, event-driven workflows or “closed-loop” operational BI. In this regard, StreamBase lags behind SeeWhy, which has introduced a toolkit in the latest version of its CEP/ESP platform that enables standards-based integration with third-party BPM and rules-engine environments.


· StreamBase has previewed a broad range of new CEP/ESP features that will not be generally available for another three months. Prospects who find these latest announcements enticing may get frustrated waiting for StreamBase to ship the promised release. Just as serious, prospects may get distracted by announcements that other CEP/ESP vendors, established and startup, are likely to make in coming months.

Recommended Actions

Recommended Vendor Actions

· StreamBase should seek to be acquired by a leading SOA, ESB, BI, DW, or DI vendor, so as to increase its visibility in today’s crowded CEP/ESP market; leverage the acquirer’s R&D resources; tap into the acquirer’s global sales, marketing, and channels; and integrate its offerings tightly into a best-of-breed enterprise software product family.

· To drive its CEP/ESP technology into the enterprise primetime market, StreamBase should consider OEM’ing to SOA, ESB, BI, DW, and/or DI vendors; developing midmarket packaging, pricing, partnerships, and go-to-market messaging for its solutions; and delivering the functionality through new approaches/channels such as software-as-a-service (SaaS), purpose-built appliances, and open source.

· To ensure that it can continue to scale up its CEP/ESP technologies using low-cost distributed approaches, StreamBase should recruit a broader range of technology partners—including HP, EMC, Intel, Sun, Teradata/NCR, and Cisco—and focus on grid, in-memory, multi-core, and other high-performance, scalable computing architectures. StreamBase should also regularly publish third-party benchmarks that vouch for its performance and scalability in processing high-volume event streams, so as to bolster its claims in this regard.

· To expand the range of vertical markets that it can address with packaged CEP/ESP solutions, StreamBase should develop application frameworks/accelerators for telecommunications, government/military, transportation/logistics, media, pharmaceuticals, consumer packaged goods, and other industries. Just as important, StreamBase should recruit consulting firms, systems integrators, value-added resellers, and other channel partners in those verticals, so that it can engage their domain expertise in developing solutions tailored to those segments’ specific requirements.

· To expand its ability to sell its CEP/ESP solutions in closed-loop operational BI environments, StreamBase should introduce standards-based interfaces to third-party BPM and rules-engine environments. At the very least, StreamBase should implement a WS-Business Process Execution Language (BPEL) interface, as well as tighter integration with leading BPM engines from TIBCO, IBM, Microsoft, Lombardi, and others.

**********************************

Jim

herenow Complex Event Processing IV

All:

Further continuation:

So I chose to write a CIR on StreamBase, covering all of these announcements. I usually like to get a briefing from the vendor in question before or during writing the CIR. I believe I also spoke with StreamBase during that time (though I’m not 100 percent sure; my notes are back home, and I’m on the road covering another vendor right now). Anyway, regardless of the vendor or the announcement(s), for each and every CIR I prepare a to-myself notes doc that I call a “raw stuff.” What these notes do is boil the event down to the most concise, BS-free description of the event (i.e., of the actual substance of the vendor’s specific announcements), to serve as the starting point for my analysis. Here’s my StreamBase “raw stuff,” distilled/rewritten from the press releases (and any other relevant info/notes in my possession) on that particular day:

****************************************

StreamBase raw stuff jun 19 2007

June 19, 2007 -- StreamBase Systems announced StreamBase 5.0, the latest generation of its complex event processing (CEP) platform. StreamBase is previewing various components of StreamBase 5.0 at the Securities Industry and Financial Markets Association (SIFMA) Technology Management Conference & Exhibit in New York City from June 19-21. StreamBase 5.0 is expected to be generally available sometime before September 30, 2007. A free trial of the StreamBase software is currently available for download at www.streambase.com/developers. Initial launch partners of StreamBase 5.0 include IBM, Microsoft, and Sybase.

StreamBase 5.0 includes the following enhancements:

· Eclipse-based Integrated Development Environment (IDE): StreamBase 5.0 completes the transition of the Studio IDE to Eclipse to support graphically oriented development integration between StreamSQL, Java, and Eclipse plug-ins. StreamBase 5.0 developers can extend their StreamBase CEP applications via custom operators and integrations with external systems. In addition, StreamBase application developers are now able to access and leverage the entire ecosystem of Eclipse plug-ins from within the StreamBase development environment, including plug-ins for source code version control, task management, graphical UI development and integration, XML editors, and SQL design tools.

· End-to-End Application Frameworks: StreamBase 5.0’s open development platform introduces industry-specific Application Frameworks, with the first one designed for algorithmic equities trading; this is an extension to previously announced solution sets for Reg NMS and Markets in Financial Instruments Directive (MiFID). The StreamBase Algorithmic Trading Framework is based upon a number of proven best practices for developing the full range of real-time components comprising an algorithmic trading application. The framework helps speed the development, customization, and delivery of these applications, including trading strategies, execution strategies, real-time P&L management, and real-time transaction cost analysis.

· Multi-Event CEP Pattern Matching: StreamBase 5.0 introduces enhanced pattern-matching syntax, enabling users to more easily develop applications that recognize the order, presence, or absence of complex combinations of events in real-time. Patterns can be identified within single event streams or across multiple parallel streams over any given period – whether a response is desired in real-time or over an extended time interval. With these enhancements, developers can quickly build complex event recognition applications that may be used for real-time fraud and intrusion detection, network monitoring, click stream analysis, anti-money laundering, and more.

· New Support for Advanced Data Types: StreamBase 5.0 introduces support for Binary Large Objects (BLOBs), enhancing the platform’s ability to address multimedia and document-centric application requirements, in addition to existing support for video, image, audio, XML payloads, unstructured text, and other data types.

· Expanded Data Visualization Tools for Real-Time Monitoring: StreamBase 5.0 introduces integration with several third-party visualization tools, including Adobe Flex, Microsoft Windows Presentation Foundation (WPF), Java Swing, and Eclipse Standard Widget Toolkit (SWT). StreamBase 5.0 provides bi-directional interaction with user interfaces built from these tools, enabling the creation of dashboards and other visual interfaces for monitoring and controlling real-time applications.

· High-Capacity Data Management & Persistence Framework: StreamBase 5.0 Chronicle is the second major release of the enhanced and extended persistence framework for time series data. Chronicle now provides optimized read/write integration with industry-leading, high-capacity tick stores including IBM DB2, Sybase RAP, and Vertica. In addition, Chronicle also supports bulk-loaders for these high-capacity tick stores and continues to leverage the standard JDBC interface for connecting to other historical databases (see the JDBC sketch after this list).

· Enterprise-Class Security: StreamBase 5.0 includes new advanced security capabilities, including event-level security support, network data encryption, user authentication through secure integration with LDAP servers, and role-based authorization to control user access and activities (see the LDAP sketch after this list).

· End-to-End Integration with External Data Sources: StreamBase 5.0 introduces the ability to use StreamBase’s Eclipse-based Adapter Toolkit to connect StreamBase CEP applications to virtually any data source. In addition, StreamBase 5.0 provides a wide range of pre-built adapters to all major market data and messaging infrastructure systems. StreamBase 5.0 adapters include TIBCO Rendezvous and EMS, Java Message Service (JMS), Reuters Market Data System (RMDS), Wombat, Bloomberg, IBM WebSphere Front Office (WFO), and opentick, enabling developers to tie data feeds directly into their StreamBase applications and provide end-to-end integration with any system.

· Enhanced Administration & Run-Time Features: StreamBase 5.0 introduces improved remote administration capabilities, error management and reporting, deployment flexibility with rolling upgrades, flexible data sharing capabilities across application components, and dozens of other improvements.
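
Two asides on the plumbing mentioned in the bullets above, both of them my own sketches rather than anything StreamBase has published. First, the “standard JDBC interface” path in the Chronicle bullet boils down to batched inserts of time-series rows; the plain Java/JDBC sketch below shows the shape of it, with a made-up table and an in-memory H2 database (assumed to be on the classpath) standing in for a real tick store:

    import java.sql.*;

    // Sketch of time-series persistence over plain JDBC: batch-insert tick
    // events into a historical store. Table, columns, and the H2 URL are
    // invented for illustration; any JDBC-accessible database would do.
    public class TickPersister {
        public static void persist(Connection conn, long[] times, double[] prices)
                throws SQLException {
            String sql = "INSERT INTO ticks (ts, price) VALUES (?, ?)";
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                for (int i = 0; i < times.length; i++) {
                    ps.setTimestamp(1, new Timestamp(times[i]));
                    ps.setDouble(2, prices[i]);
                    ps.addBatch();      // accumulate rows for one round trip
                }
                ps.executeBatch();      // bulk-load-style write
            }
        }

        public static void main(String[] args) throws SQLException {
            try (Connection conn = DriverManager.getConnection("jdbc:h2:mem:ticks")) {
                conn.createStatement().execute(
                    "CREATE TABLE ticks (ts TIMESTAMP, price DOUBLE)");
                persist(conn, new long[]{0L, 1_000L}, new double[]{101.5, 101.7});
            }
        }
    }

Second, “user authentication through secure integration with LDAP servers” generically means a directory bind, as in this JNDI sketch; the server URL and distinguished name are placeholders, since the release doesn’t spell out StreamBase’s own integration details:

    import java.util.Hashtable;
    import javax.naming.*;
    import javax.naming.directory.InitialDirContext;

    // Sketch of LDAP authentication via JNDI: a successful directory bind
    // means the credentials are valid. URL and DN below are placeholders.
    public class LdapAuthSketch {
        public static boolean authenticate(String userDn, String password) {
            Hashtable<String, String> env = new Hashtable<>();
            env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
            env.put(Context.PROVIDER_URL, "ldap://ldap.example.com:389");
            env.put(Context.SECURITY_AUTHENTICATION, "simple");
            env.put(Context.SECURITY_PRINCIPAL, userDn); // e.g. uid=jdoe,ou=people,dc=example,dc=com
            env.put(Context.SECURITY_CREDENTIALS, password);
            try {
                new InitialDirContext(env).close(); // bind succeeded: authenticated
                return true;
            } catch (NamingException badCredentialsOrUnreachable) {
                return false;
            }
        }
    }

A role-based authorization layer would then map the authenticated user’s directory groups onto the event streams and operations that user is permitted to touch.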

In addition, StreamBase 5.0 builds on the high-performance features of previous releases and remains the fastest CEP server available today, capable of processing hundreds of thousands of messages per second per CPU. The new StreamBase 5.0 enhancements also further speed the development and delivery of CEP applications that address the rapidly growing real-time processing demands of customers and partners worldwide. StreamBase 5.0 offers built-in support for IBM’s DB2 data server, WebSphere Front Office, and xSeries hardware.

StreamBase also unveiled a comprehensive CEP Reference Architecture for Algorithmic Trading for accelerating the time-to-market and increasing the extensibility of real-time algorithmic (algo) trading applications. Developed by StreamBase and Microsoft, the new CEP Reference Architecture is based on industry best practices and describes the critical real-time sections of an algo trading system and its key presentation layer components. The CEP Reference Architecture for Algorithmic Trading delivers an end-to-end framework for designing an algorithmic equities trading application and also describes the interconnection between its key real-time and user interaction components. The key real-time components are Market Data Cleansing, Data Enrichment, Trading Strategies, Risk Management, Execution Strategies, and Market Impact / Implementation Shortfall processing. Microsoft’s WPF enables rich client applications, and its powerful and flexible programming model integrates support for flexible layout, high-quality text, resolution-independent graphics, animation, video and 3D.
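
To visualize that real-time spine, here is a quick sketch of the named stages chained as simple functions in plain Java. The stage names come from the announcement; the thresholds, data types, and logic are invented:

    import java.util.Optional;
    import java.util.function.Function;

    // Sketch of the reference architecture's real-time spine: each stage
    // transforms or filters a tick before the next stage sees it. Only the
    // stage names are from the announcement; everything else is invented.
    public class AlgoPipelineSketch {
        record Tick(String symbol, double price) {}
        record Order(String symbol, int qty) {}

        public static void main(String[] args) {
            Function<Tick, Optional<Tick>> cleanse =                      // Market Data Cleansing
                t -> t.price() > 0 ? Optional.of(t) : Optional.empty();
            Function<Tick, Tick> enrich =                                 // Data Enrichment
                t -> new Tick(t.symbol().toUpperCase(), t.price());
            Function<Tick, Optional<Order>> strategy =                    // Trading Strategies
                t -> t.price() < 100 ? Optional.of(new Order(t.symbol(), 10))
                                     : Optional.empty();
            Function<Order, Optional<Order>> risk =                       // Risk Management
                o -> o.qty() <= 1_000 ? Optional.of(o) : Optional.empty();

            cleanse.apply(new Tick("ibm", 99.5))
                   .map(enrich)
                   .flatMap(strategy)
                   .flatMap(risk)
                   .ifPresent(o -> System.out.println("Execute: " + o));  // Execution Strategies
        }
    }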

Furthermore, Sybase, Inc. announced that Sybase’s Real-time Analytics Platform, a highly optimized real-time data processing service platform, now integrates with StreamBase’s CEP platform. The joint solution will support real-time applications having large storage requirements, such as back-testing for algorithmic trading, risk analysis, and historical trade auditing. Sybase Real-time Analytics Platform is a high-performance, enterprise-wide solution that delivers in-memory transaction processing, massive time-series data management, and deep historical data analysis, and is built on Sybase’s capital-market-proven data management and patented data analytics technologies, which have been enhanced to perform in a highly scalable manner. Sybase Real-time Analytics Platform can deliver virtual market data feeds at accelerated speeds to match StreamBase’s high-performance CEP platform and targets a wide range of real-time front-, middle-, and back-office applications including:

  • Back-testing trading strategy algorithms using virtual feeds of historical data (see the replay sketch after this list)
  • Algorithmic trading applications running queries against both real-time market data and massive historical data repositories of 100+ terabytes
  • Validating predictive modeling applications by comparing predicted events to actual events
  • Performing pre-trade risk and compliance analysis
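
And here is the replay sketch promised in the first bullet above: a “virtual feed” is essentially stored historical ticks pushed through the same handler a live feed would drive, with the gaps between ticks compressed by an acceleration factor. The factor and the tick data below are invented:

    import java.util.List;
    import java.util.function.Consumer;

    // Sketch of a "virtual feed" for back-testing: replay stored ticks into
    // the handler a live feed would drive, faster than real time.
    public class VirtualFeedSketch {
        record Tick(long tsMillis, double price) {}

        static void replay(List<Tick> history, double speedup, Consumer<Tick> handler)
                throws InterruptedException {
            long prev = history.isEmpty() ? 0 : history.get(0).tsMillis();
            for (Tick t : history) {
                Thread.sleep((long) ((t.tsMillis() - prev) / speedup)); // compress time
                prev = t.tsMillis();
                handler.accept(t); // the strategy under test sees a "live" tick
            }
        }

        public static void main(String[] args) throws InterruptedException {
            replay(List.of(new Tick(0, 100.0), new Tick(1_000, 100.5)),
                   10.0, t -> System.out.println("tick " + t));
        }
    }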

****************************************

Jim

herenow Complex Event Processing III

All:

Before I lose the train of thought--here's the continuation of that same thoughtstream:

From these “clipped” articles, I often identify one or more events—i.e., vendor announcements encapsulated in press releases—that seem important enough to be the subject of a Current Analysis event report—a Competitive Intelligence Report (CIR). Bear in mind that there are a great many vendor announcements that don’t change the competitive balance, and hence get filtered out of my equation. You know what I’m talking about: the vast majority of the IT vendor puff PR stuff that hits the wires. I ignore press releases of this sort:

  • Vendor so-and-so just landed in Gartner’s Magic Quadrant yet again, or was lauded by Forrester or AMR for being so gosh-darn innovative.
  • Vendor so-and-so just notched another impressive customer win, and is delivering a sensational solution that is revolutionizing life as we know it.
  • Vendor so-and-so just hired Joe or Jane Blow to be the executive VP for solution marketing.
  • Vendor so-and-so had yet another profitable quarter.
  • Vendor so-and-so’s solution won Portugal’s Good Software Design Award for the third year running.
  • Vendor so-and-so’s CTO is speaking to Wall Street analysts on the theme of “SOA: What’s it Stand For?”
  • Vendor so-and-so finally ships v5.1.8.7 SP 3.4 of its SplenDaWare 2000 product, incorporating 4,132 incremental feature tweaks that only diehard users really understand or care about.

The ones that I do pay attention to are those from substantial vendors who are making substantial announcements in substantial—albeit sometimes still emerging—markets. For example, the following press releases from StreamBase, a CEP/ESP software vendor, collectively jumped out at me on that particular day:

**********************************

StreamBase press releases jun 19 2007

June 19, 2007 01:41 PM Eastern Daylight Time

StreamBase 5.0 Delivers First End-to-End Complex Event Processing Platform

CEP Leader Reveals Industry’s Most Complete, High-Performance CEP Platform At SIFMA’s Technology Management Conference

Technology Management Conference & Exhibit

NEW YORK--(BUSINESS WIRE)--Today, StreamBase Systems revealed StreamBase 5.0, which raises the bar by delivering the first end-to-end Complex Event Processing (CEP) platform — a major new release enabling rapid development and deployment of real-time applications on the industry’s most proven platform.

IBM’s Vice President of Information Management, Arvind Krishna commented, “We are pleased that StreamBase’s 5.0 Complex Event Processing platform offers built-in support for IBM’s DB2® data server, WebSphere® Front Office, and xSeries® hardware, and look forward to continuing our technology partnership and enabling CEP capabilities and benefits that will appeal to a broad array of enterprises everywhere. By combining the power of IBM and StreamBase, customers can have the confidence to address even the most complex, real-time business challenges, and do so in a scalable, maintainable, and proven manner.”

StreamBase 5.0 further pioneers the next-generation of CEP by introducing major new functionality and addressing dozens of customer feature requests related to developer productivity, out-of-the-box application frameworks, end-to-end application development, expanded support for advanced data types, flexible pattern matching, enterprise-class security, and high-capacity tickstore support. This release also continues to build upon the industry-leading performance of previous releases — and remains the fastest CEP server available today, capable of processing hundreds of thousands of messages per second per CPU. In addition, the new StreamBase 5.0 enhancements further speed the development and delivery of CEP applications that address the rapidly growing real-time processing demands of customers and partners worldwide.

“In the four years since StreamBase pioneered this technology, Complex Event Processing has matured enormously and StreamBase 5.0 delivers the leading-edge capabilities that will make CEP the ubiquitous approach for building real-time applications,” said Dr. Michael Waclawiczek, Senior Vice President, Products at StreamBase Systems. “We have solicited feedback and input from over two dozen customers, and their requirements are incorporated into StreamBase 5.0.”

StreamBase 5.0 includes:

Eclipse-based Integrated Development Environment (IDE) — With StreamBase 5.0, StreamBase completes the transition of its Studio IDE to Eclipse resulting in the seamless development integration between StreamSQL, Java, and Eclipse plug-ins. By using one simple, intuitive graphical platform based on Eclipse, the de facto standard for Java application development, StreamBase 5.0 developers can easily extend their StreamBase CEP applications via custom operators and integrations with external systems. In addition, StreamBase application developers are now able to access and leverage the entire ecosystem of Eclipse plug-ins from within the StreamBase development environment. Such plug-ins include those for source code version control, task management, graphical UI development and integration, XML editors, and SQL design tools.

End-to-End Application Frameworks — As an extension to previously announced solution sets for Reg NMS and Markets in Financial Instruments Directive (MiFID), StreamBase 5.0’s open development platform introduces industry-specific Application Frameworks, with the first one designed for algorithmic equities trading. The StreamBase Algorithmic Trading Framework is based upon a number of proven best practices for developing the full range of real-time components comprising an algorithmic trading application. The framework helps speed the development and delivery of these applications, including trading strategies, execution strategies, real-time P&L management, and real-time transaction cost analysis. [SEE ALSO: STREAMBASE LAUNCHES CEP REFERENCE ARCHITECTURE FOR ALGORITHMIC TRADING, June 19, 2007].

“Developed by StreamBase and third-party vendors and programmers, our Application Frameworks are part of a strategic initiative to help accelerate the delivery of solutions that today’s competitive organizations need to meet key challenges around acquiring, processing, and acting on high-volume real-time data,” said Waclawiczek. “Customers and partners can now leverage these Application Frameworks to design and implement specific solution sets that address their core business pains.”

Multi-Event CEP Pattern Matching — By introducing enhanced pattern-matching syntax, StreamBase 5.0 enables users to more easily develop applications that recognize the order, presence, or absence of complex combinations of events in real-time. Patterns can be identified within single event streams or across multiple parallel streams over any given period – whether a response is desired in real-time or over an extended time interval. With these enhancements, developers can quickly build complex event recognition applications that may be used for real-time fraud and intrusion detection, network monitoring, click stream analysis, anti-money laundering, and more.

New Support for Advanced Data Types — With the growing adoption of CEP in various industries, there is an increased demand for StreamBase’s platform to process, analyze, and manage a variety of new data types including video, image, audio, XML payloads, and unstructured text. StreamBase 5.0 addresses this by introducing support for Binary Large Objects (BLOBs). BLOB support greatly enhances StreamBase’s ability to meet multimedia and document-centric application requirements in both existing industries and new markets that wish to embrace CEP.

Expanded Data Visualization Tools for Real-Time Monitoring — StreamBase 5.0 introduces integration with a number of third-party visualization tools, including Adobe® Flex™, Microsoft Windows® Presentation Foundation™ (WPF), Java™ Swing, and Eclipse Standard Widget Toolkit (SWT). StreamBase 5.0 provides bi-directional interaction with user interfaces built from these tools, enabling the creation of powerful dashboards and other visual interfaces for monitoring and controlling real-time applications.

High-Capacity Data Management & Persistence Framework — StreamBase 5.0 Chronicle™ is the second major release of the enhanced and extended persistence framework for time series data. Chronicle now provides optimized read/write integration with industry-leading, high-capacity tick stores including IBM DB2®, Sybase RAP®, and Vertica®. In addition, Chronicle also supports bulk-loaders for these high-capacity tick stores and continues to leverage the standard JDBC interface for connecting to other historical databases. [SEE ALSO: SYBASE ANNOUNCES NEW FINANCIAL SERVICES REAL-TIME ANALYTICS AND COMPLEX EVENT PROCESSING PLATFORM WITH STREAMBASE, June 19, 2007]

Enterprise-Class Security — In addition to offering enterprise-class clustering, high availability, and being SMP-enabled, StreamBase 5.0 also includes new advanced security capabilities. Adding to existing security features, StreamBase 5.0 now offers unique event-level security support, network data encryption, user authentication through secure integration with LDAP servers, and role-based authorization to control user access and activities. With this release, StreamBase offers the industry’s most comprehensive security implementation for enterprise-class CEP applications.

End-to-End Integration with External Data Sources — Using StreamBase’s Eclipse-based Adapter Toolkit, developers can connect their StreamBase CEP applications to virtually any data source. In addition, StreamBase 5.0 provides a wide range of pre-built adapters to all major market data and messaging infrastructure systems. StreamBase 5.0 adapters include TIBCO Rendezvous® and EMS®, Java Message Service (JMS), Reuters Market Data System (RMDS), Wombat, BloombergSM, IBM WebSphere® Front Office (WFO), and opentick, enabling developers to seamlessly tie data feeds directly into their StreamBase applications and provide end-to-end integration with any system.

Enhanced Administration & Run-Time Features — With improved remote administration capabilities, error management and reporting, deployment flexibility with rolling upgrades, flexible data sharing capabilities across application components, and dozens of other improvements, StreamBase 5.0 offers the industry’s most complete, robust and mature CEP platform.

"Complex Event Processing has proven itself as a crucial technology," said Sang Lee, Research Director at Aite Group. "With the release of StreamBase 5.0, the company is further demonstrating its commitment to address the rising demand for enterprise-class CEP. StreamBase is helping to drive the evolution and adoption of CEP not only by financial services institutions, but also by many other sectors that need to solve challenges around low-latency processing of high-volume data."

StreamBase is currently previewing various components of StreamBase 5.0 at the Securities Industry and Financial Markets Association (SIFMA) Technology Management Conference & Exhibit in New York City from Tuesday, June 19th through Thursday, June 21st at booth # 1770. Initial launch partners of StreamBase 5.0 include IBM®, Microsoft®, and Sybase®.

StreamBase 5.0 is expected to be generally available sometime before September 30th, 2007. A free trial of the StreamBase software is currently available for download at www.streambase.com/developers.

About StreamBase

StreamBase's award-winning Stream Processing Platform, fueled by the standards-based next generation query language, StreamSQL™ and an Eclipse-based development environment, offers the fastest Complex Event Processing software for processing of real-time and historical data. With StreamBase and StreamSQL, enterprises can query, process, and analyze real-time and stored data at rates of up to hundreds of thousands of messages/second. StreamBase’s combination of real-time performance, persistence, and programmability empowers enterprises in industries like financial services, telecom and networking, e-Business, government and military to solve new classes of business challenges in a more timely, scalable, and cost effective manner than custom-coding. StreamBase is headquartered in Lexington, Massachusetts with offices in New York, Washington, D.C., and London. A downloadable version of StreamBase's software is available at http://www.streambase.com/.

© 2007 StreamBase Systems, Inc. All other trademarks or trade names are properties of their respective owners. All rights reserved.

Contacts

StreamBase
Donna Parent, 781-761-0841
donna.parent@streambase.com
or
PerkettPR
Christine Major, 603-743-4534
streambase@perkettpr.com

June 19, 2007 01:42 PM Eastern Daylight Time

StreamBase Launches CEP Reference Architecture for Algorithmic Trading

Industry-First End-to-End Framework For Algo Trading Speeds Application Design & Development

Technology Management Conference & Exhibit

NEW YORK--(BUSINESS WIRE)--StreamBase Systems, Inc., the leader in high-performance Complex Event Processing (CEP), today unveiled a comprehensive CEP Reference Architecture for Algorithmic Trading for accelerating the time-to-market and increasing the extensibility of real-time algorithmic (algo) trading applications. Developed by StreamBase and Microsoft Corp., the new CEP Reference Architecture is based on industry best practices and describes the critical real-time sections of an algo trading system and its key presentation layer components.

“We are excited to align Microsoft Windows Presentation Foundation (WPF) with StreamBase’s CEP platform for building real-time algo trading apps based on the newly released Reference Architecture,” said Craig Saint-Amour, U.S. capital markets industry solutions director, Microsoft Corp. “With financial institutions facing low-latency, high-volume processing demands, we understand that increasing developer productivity and accelerating time-to-market are critical areas fueling competitive advantage. Microsoft and StreamBase both recognize the importance of a powerful presentation layer for algorithmic trading applications and the combination of WPF with the StreamBase CEP platform is a great fit.”

The CEP Reference Architecture for Algorithmic Trading delivers an end-to-end framework for designing an algorithmic equities trading application and also describes the interconnection between its key real-time and user interaction components. The key real-time components are Market Data Cleansing, Data Enrichment, Trading Strategies, Risk Management, Execution Strategies, and Market Impact / Implementation Shortfall processing.

Microsoft’s WPF enables rich client applications, and its powerful and flexible programming model integrates support for flexible layout, high-quality text, resolution-independent graphics, animation, video and 3D. StreamBase provides the fastest, most scalable, high-performance CEP platform in the industry – enabling the development of new real-time applications much faster than with traditional custom coding, in weeks not months. [SEE ALSO: STREAMBASE 5.0 DELIVERS FIRST END-TO-END COMPLEX EVENT PROCESSING PLATFORM, June 19, 2007].

“As the volume and velocity of financial market data continue to explode, staying ahead of the competition requires the right tools and infrastructure,” said John Partridge, Vice President, Industry Solutions at StreamBase. “This new CEP Reference Architecture will give developers a competitive edge for building real-time algo trading applications with a fully realized user interface that makes it easy to monitor and control the application’s response to the market.”

To learn more about the CEP Reference Architecture for Algorithmic Trading delivered by Microsoft and StreamBase, visit the Microsoft Developer Network (MSDN) to download a solutions whitepaper, which will be posted shortly after the Conference. [http://msdn2.microsoft.com/en-us/architecture/aa699365.aspx]

In addition, please visit StreamBase at booth #1770 during the Securities Industry and Financial Markets Association (SIFMA) Technology Management Conference & Exhibit, June 19-21 in New York, NY to learn more about the new CEP Reference Architecture for Algorithmic Trading and see a WPF / StreamBase CEP demonstration.


Sybase Announces New Financial Services Real-Time Analytics and Complex Event Processing Platform with StreamBase

Combined Platforms Enable Complex Analytics with High-Volume, Real-Time and Historical Data

DUBLIN, CA — JUNE 19, 2007 — Sybase, Inc. (NYSE: SY), a leading provider of enterprise infrastructure and mobile software, today announced that Sybase’s Real-time Analytics Platform, a highly optimized real-time data processing service platform, now integrates with StreamBase’s high-performance Complex Event Processing (CEP) platform. The joint solution will support real-time applications having large storage requirements, such as back-testing for algorithmic trading, risk analysis and historical trade auditing.

The driving market forces behind this solution include competitive pressure to build complex quantitative trading strategies, a heightened demand for combined static and dynamic data analysis, and the explosive growth of market data volumes. "Vendors are recognizing the market demand for applications that can help institutions integrate real-time data, historical data, and overlay analytics to provide both derived data and actionable output,” said Ben Butterfield, senior research associate at The Tower Group.

“We’re elated to work with Sybase,” said Dr. Michael Waclawiczek, senior vice president, products, StreamBase. “As a leader in enterprise infrastructure and mobile software, Sybase aligns perfectly with our industry-leading CEP platform. We look forward to working with Sybase to bring this joint solution to our customers and to further address today’s real-time data intensive challenges.”

Sybase® Real-time Analytics Platform is a high performance enterprise-wide solution that delivers in-memory transaction processing, massive time-series data management, deep historical data analysis and is built on Sybase’s capital market industry-proven data management and patented data analytics technologies that have been enhanced to perform in a highly scaleable manner.

“Based on our extensive experience in financial services and leading market share, Sybase recognizes the importance of high performance CEP for the industry,” said Steve Capelli, president, worldwide field operations. “The combination of StreamBase and Sybase Real-time Analytics Platform provides a powerful platform on which financial services firms can build strategic real-time applications.”

Sybase Real-time Analytics Platform can deliver virtual market data feeds at accelerated speeds to match StreamBase’s high-performance CEP platform and targets a wide range of real-time front, middle and back office applications including:

  • Back-testing trading strategy algorithms using virtual feeds of historical data
  • Algorithmic trading applications running queries against both real-time market data and massive historical data repositories of 100+ terabytes
  • Validating predictive modeling applications by comparing predicted events to actual events
  • Performing pre-trade risk and compliance analysis

Please visit the Sybase booth #3012 during the Securities Industry and Financial Markets Association (SIFMA) Technology Management Conference & Exhibit, June 19-21 in New York, NY, to learn more about the integration of Sybase Real-time Analytics Platform and StreamBase platforms.

About Sybase, Inc.
Sybase is the largest global enterprise software company exclusively focused on managing and mobilizing information from the data center to the point of action. Sybase provides open, cross-platform solutions that securely deliver information anytime, anywhere, enabling customers and partners to create an information edge. The world's most critical data in commerce, communications, finance, government and healthcare runs on Sybase. For more information, visit the Sybase Web site: http://www.sybase.com/.

####

Sybase is a registered trademark of Sybase, Inc. All other company and product names mentioned may be trademarks of the respective companies with which they are associated.

Special Note: Statements concerning Sybase’s future growth, prospects and new product releases are, by nature, forward-looking statements that involve a number of uncertainties and risks, and cannot be guaranteed. The words “anticipate,” “believe,” “estimate,” “expect,” “intend,” “will” and similar expressions relating to Sybase and its management may identify forward-looking statements. Such statements are intended to reflect Sybase’s current views with respect to future events and may ultimately prove to be incorrect or false. Factors that could cause actual events or results to differ materially include shifts in customer demand, rapid technology changes, competitive factors and unanticipated delays in scheduled product availability. These and other risks are detailed from time to time in Sybase’s Securities and Exchange Commission filings, including, but not limited to, its annual report on Form 10-K and its quarterly reports on Form 10-Q (copies of which can be viewed on Sybase’s Web site).

Ruth Busbee
Citigate Cunningham
(415) 618-8739
rbusbee@cunningham.com

*******************************************


Jim