See Duane Nickull’s post and Anne Thomas Manes’ musings on the same.
First off, I started reading into this because I wasn’t, and still am not, clear on why Nickull is using the term “forensic” in this context. He refers to “forensic architecture” as “the process of describing the architecture of something after it has been built,” as if this were a non-judgmental effort. But he then uses it to refer to his dissection of various failed, ineffectual, and/or underwhelming SOA standardization efforts in which he was involved: ebXML, the W3C Web Services Architecture Working Group, the UN/CEFACT eBusiness Architecture, and the OASIS Reference Model for SOA. It becomes clear, in his analysis, that he’s actually deploying “forensic” in its standard negative sense: a post-mortem on a victim, and the building of an evidence-based case for prosecuting the perpetrators.
Though, fortunately, Nickull doesn’t lay it on that heavily. And he provides a good analysis of what went wrong and lessons learned from those various efforts. Reading further, it’s clear to me that he’s primarily critiquing the applicability of the software development life cycle’s “waterfall methodology” to committee-based development of standards in sprawling, ill-defined architectural initiatives--of which SOA is perhaps the classic case in point.
What his analysis points to is the value of a retrospective approach to clarifying the core design principles of an emergent architectural phenomenon that simply works--such as the Web, with REST as perhaps the premier textbook example of a “principles clarification” exercise. In contrast to the “waterfall” method, I’d call this the “salmon swimming upstream to reconceptualize in their presumed/intuited spawning place” approach. Or maybe simply the “salmon” methodology.
Which reminds me of a point I need to make. For the past several years, I’ve been focusing on a space--business intelligence (BI), data warehousing (DW), and data integration (DI)--in which SOA (however defined and standardized) has had just a minimal footprint, and primarily as one integration approach in the back-end. But BI/DW/DI continues to grow and innovate at an amazing clip, still hinging on an old, stable, universal standard: SQL (with SOA-ish XQuery/XPath not achieving any significant momentum swimming upstream against this powerful current).
Interestingly, there are few if any industry specification activities, in any forum, that involve the BI/DW/DI segment. Much of the front-end BI innovation revolves around integrating Web 2.0-style interfaces and services, and much of that relies on REST (the non-architected architecture that has thoroughly eclipsed SOA in real-world adoption).
REST-ive salmon continue to spawn like crazy downstream in the BI and analytics market. Look, for example, at my latest Forrester Information and Knowledge Management blog post on the next-generation OLAP “Project Gemini” features that Microsoft began demonstrating publicly this week. Not much SOA in this approach, but a hefty dose of REST, thanks to its tight integration with Microsoft’s SharePoint portal/collaboration platform, and a lot of SQL, owing to integration with SQL Server Analysis Services.
By the way, people who read James Kobielus’ blog may or may not realize that I now put most of my tech-meaty musings under Forrester’s I&KM blog. Please plug that blog into your reader. I’m one of many Forrester analysts who post to that regularly. Seriously great stuff, all of it.
And you thought all I do nowadays is write pretentious poetry.
Pretentious? Moi? Au contraire, mon frere.