USWORD
Know me by my most
mnemonic non-mnemonic
name--OK?
My most tangled chain
of digit-twisting and one-
off emoticons.
Whatever suggests
me to you and us to no
one overlooking.
Thursday, March 31, 2005
Monday, March 28, 2005
poem End of March
END OF MARCH
In luxurious gloom.
In carbonated rain.
The lamb stands shivering.
Shakes the
remains of
her late last
wintering.
Sunday, March 27, 2005
poem Aquifer
AQUIFER
Set. More massively
installed than the stone
beneath the weight of
the world's monuments.
As thoroughly stained
in sediment as
earth's running rivers
and God's ancient veins.
Streaked with the runoff
of daily rains and
fresh capillaries
of rust and red salt.
Friday, March 25, 2005
fyi Could Ajax Wash Away 'Smart Clients'?
All:
Pointer to article:
http://www.microsoft-watch.com/article2/0,1995,1777009,00.asp
Kobielus kommentary:
Much as I hate to see an unnecessary new acronym enter our lingo, I think “AJAX” (Asynchronous JavaScript and XML) is preferable to “RIA” (rich Internet applications). They both refer to the same phenomenon—the enriched browser-oriented presentation tier—but AJAX references open industry standards, whereas RIA (the older of the two acronyms) carries the unpleasant connotation of partially proprietary approaches from vendors such as Macromedia, Nexaweb, and Microsoft.
Of course, AJAX and RIA aren’t the first terms to be applied to this phenomenon (the enriched Web presentation tier). Back in the day (just a few years ago), a lot of attention was paid to “Dynamic HTML.” Now the new twist is that more and more of the content being interchanged within the presentation tier (between clients and servers) is XML, not HTML. And it’s an XML vocabulary—XHTML—that has superseded HTML 4.0 as the evolution path for Web presentation markup standards. As the article points out, AJAX is a recently coined acronym that loosely refers to any and all of the following (a minimal sketch follows the list):
• standards-based presentation using XHTML and Cascading Style Sheets (CSS);
• dynamic display and interaction using the Document Object Model;
• data interchange and manipulation using XML and XSLT;
• asynchronous data retrieval using XMLHttpRequest;
• JavaScript binding everything together.
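To make those pieces concrete, here is a minimal sketch of the pattern, circa 2005; the endpoint URL and element names are hypothetical, not from the article:

```typescript
// Minimal AJAX sketch (browser-side): asynchronous retrieval via
// XMLHttpRequest, then a DOM update. /inbox.xml is a hypothetical endpoint.
function fetchInboxCount(onDone: (count: number) => void): void {
  const req = new XMLHttpRequest();
  req.open("GET", "/inbox.xml", true); // third argument: asynchronous
  req.onreadystatechange = () => {
    if (req.readyState === 4 && req.status === 200) {
      // Data interchange as XML; walk it with the Document Object Model.
      const msgs = req.responseXML?.getElementsByTagName("message");
      onDone(msgs ? msgs.length : 0);
    }
  };
  req.send(null);
}

// JavaScript binds everything together: refresh one element, no page reload.
fetchInboxCount((n) => {
  const badge = document.getElementById("inbox-count");
  if (badge) badge.textContent = `${n} new`;
});
```

The whole trick is in that asynchronous round trip: the page stays put while the data moves.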
Of course, AJAX can’t scrub the grubby pipes of the Web presentation tier overnight and get it all spotless and pristine. As an approach, it will need to coexist with partially proprietary approaches such as Microsoft’s XAML-based “Avalon” and Macromedia’s MXML-based Flex. I’m publishing an article next month in Business Communications Review that discusses the various Web presentation tier approaches—including RIA, thin client, Windows terminal emulation, and so forth—and am quite aware of the glut of presentation approaches that are crowding this space.
It’s not an issue of AJAX vs. smart client in the Web presentation tier. Every client these days is smart to varying degrees. It’s more an issue of having client-, approach-, and protocol-agile presentation middleware tiers that can support thick client, thin client (Windows terminal emulation, of which there are several approaches), basic browser, and enriched browser (RIA, of which there are several approaches, plus AJAX/DHTML). Oh…and I haven’t even mentioned the need for presentation tiers that can do equal justice to PC, laptop, and mobile/handheld browsers and clients.
Yeesh…what a mess.
Jim
Thursday, March 24, 2005
fyi “Tensions on the Web”—and the attention paid to Collaboration
All:
Pointer to blogpost:
http://www.adambosworth.net/archives/000041.html
Kobielus kommentary:
A very thoughtful post by Adam Bosworth of Google (formerly of BEA), touching on search, taxonomies/ontologies, web services, privacy, security, managed environments, and collaboration.
But the larger framing topic is blogs—in particular, the tensions that one of his recent blog postings created, inadvertently, for him at Google—and his need to continually stress that his blog doesn’t necessarily reflect the official positions of his employer. That’s a tough balance to strike, and any of us who work for other people are always at pains to strike it. I’ve been writing Network World columns for 17+ years, always with my byline ending in “The opinions expressed are his own.” Which has usually been understood without much confusion, since I’ve rarely had any significant clout or say in the operation of the businesses that have chosen to pay me a salary. Being powerless has always motivated me to make my thoughts and expressions all the more forceful. Since this blog got started during a period of unemployment, I haven’t had much to lose (except my good name, people’s respect, and any possibility of future employment—trifling matters, really) by posting anything that entered my cranium. The alternatives are not to blog, or to blog without expressing anything even remotely interesting, daring, or controversial. I don’t care for either of those alternatives.
In this post, Bosworth was discussing the exciting developments with “folksonomies,” which are essentially semantic web environments within which anybody can post whatever tag-sets they like, in a bottom-up fashion, without draconian taxonomy/ontology-nazis telling them what tags are legal or illegal in a particular domain (analogous to how wikis operate in terms of free-for-all post/overpost concurrent authoring/editing). Bosworth acknowledges the inherent sloppiness of this sort of distributed collaboration, but places his faith in the power of team dynamics to quickly converge on a consensus, good-enough tag-set taxonomy/ontology for whatever domain they’re defining. And to evolve, police, defend, regulate, and restore whatever (presumably worthy/benign) collective artifact gets assembled through these dynamics.
All of which got me to thinking about how increasingly we’re all placing our faith in this new sort of “invisible hand”—the collaboration environment—to design online phenomena through emergent interactions, rather than trust/empower any one real-person uber-author. And it is a faith—not a science or art. Here’s how Bosworth articulates his personal faith: “I've always believed in the twin values of rationalism and humanism, but humanism has often felt as though it got short shrift in our community. In this world, it's all about people and belonging and working with others.”
Nothing wrong with those ideals. But Bosworth and others who tout the grand promise of Collaboration (I’ll capitalize this word for the rest of this post, just to zoom it for your consideration) usually treat it as something that magically emerges from some constellation of tools/services: blogs, wikis, IM/presence, P2P, e-mail, message boards, discussion groups, RSS feeds, knowledge management, shared spaces, portals, search, VoIP, etc.
And nothing wrong with those tools, in the abstract. I use most of those tools/services fairly regularly. But it’s always felt to me that IT professionals are forever looking for some Collaboration killer-app, or priestly class of killer-apps, as if therein lies nirvana, if only we can initiate and educate the benighted masses to the wonder of it all.
Excuse my cynicism on this matter. But I’ve been in the IT industry for 20 years, and for most of that time, I’ve been treated to one new Collaboration killer-app after another. Having cluttered up my virtual desktop/workspace with all of these Collaboration killer-apps, I’m now convinced that they’re someday going to kill me under the sheer weight of their o’erweening ambition.
A few posts ago, I (Kobielus) expressed exasperation with the sheer complexity of Microsoft’s Collaboration killer-apps: “I think Microsoft should take the good stuff from Groove—the P2P-based distributed file sharing/caching/synchronization features—and junk everything else. Microsoft should embed this core Groove technology into Office, Outlook, Windows Explorer, MSN Messenger, Live Communication Server, Internet Explorer, SharePoint Team Services, and the WiFi-based workgroup peer-LAN functionality within ‘Longhorn.’ But present a rich-browser interface to it all (via XAML/‘Avalon’) and radically simplify and converge the UI, integration, and management of all these scattered client and server components.”
I’ve felt that way for so long. Simplify and unclutter my Collaboration world, and I’ll be a happier camper. Which reminds me of a book I checked out of my local library a few months ago. It was called “Feng Shui Your Workspace for Dummies.” With a title like that, how could I not check it out?
It was interesting and useful, up to a point. Of course, it primarily addressed how to “feng shui” your physical workspace—your office. And, of course, it primarily addressed that topic within the context of 3000 years of Chinese astrology, numerology, metaphysics, superstition, and so forth.
But, when you cut through all the distracting nonsense (personal note: my wife is Chinese, and I very much love Chinese art and culture), you come to the essential deep structure of “feng shui.” It’s simply the art of harmonious arrangement. As a set of principles and practices, it describes how to arrange your physical space so as to maximize harmony, comfort, and productivity between you, your artifacts, your colleagues, your surroundings, and the universe at large. As you might expect, “feng shui”ing your workspace involves a lot of de-cluttering, balancing, and arranging all the objects, textures, lighting, and so forth to make you feel happy, calm, composed, etc. A lot of common sense ideas, but expressed within a set of practical/philosophical guidelines. A key take-away: Evaluate your physical space according to how the eyes, attentions, and bodies of people (including yourself) naturally move throughout that space, and endeavor to make those movements more free and flowing without becoming robotic and repetitive.
I was sort of hoping that the book would have explained how to “feng shui” my virtual workspace as well. From personal experience, that’s something I’ve rarely been able to do for very long: maximizing the harmony, comfort, and productivity of myself and everybody else I work/deal with regularly. Usually, given the hectic/ever-changing nature of my jobs and the industry and the world, disharmony and disequilibrium are more often the norm. How can I “feng shui” a world that’s splaying awkwardly in all directions? I can barely “feng shui” my head and heart in harmony with each other when the world does a number on me.
I don’t have any great hope for an ideal Collaboration environment. And I don’t place inordinate hopes in Collaboration killer-apps like Groove, blogs, wikis, etc. And I don’t expect that people, work, or the world will get any simpler or easier to Collaborate with. Sloppiness is the way of the world, when the world’s evolving out of control.
But I’ve found a way to “feng shui” the ever-messier Collaboration ecosystem, at least in my head. Ever since I covered the young Lotus Notes for Network World, in an article in 1990, I’ve been elaborating my own personal interpretive framework for all this. It found first expression in that Network World article, then in an article I did for Rapport Messaging Review in 1996, then in my “Workflow Strategies” book in 1997, and then in my Burton Group reports in 1998-2004. (Actually, now that I recall, I started developing my framework in 1987, when I was a research associate at the International Center for Information Technologies, and was exposed to Action Technologies' Fernando Flores, their Communicator structured e-mail program, and their new-agey Collaboration philosophy. But I digress).
Here now is my Collaboration conceptual framework:
• Collaboration involves sustaining a protocol for effective interaction among people and organizations: a framework for interaction between networked individuals to exchange information and manage work successfully.
• Collaboration in the business world often depends on well-defined but adaptable protocols expressed as recurring processes or workflows.
• In a dynamic world, the accent is on process adaptability, with fast-changing conditions requiring that people reorganize and improvise on the fly, using whatever resources are at their disposal to meet whatever new challenges emerge.
• The ideal enterprise collaborative environment must serve both the present (any existing process/protocol) and the foreseeable future (any process/protocol that is likely to emerge), with a clear emphasis on flexibility and adaptability to support new business goals, policies, and activities. This focus places the emphasis squarely on collaborative environments that support multifunctional, integrated application suites and integrate with services supported on legacy network platforms (NOSs), but increasingly emphasize integration with services and standards running over the new world platform (the Internet and World Wide Web). The collaborative environment should also support access to all information and functionality from an integrated, universal client environment (browser-based applications).
• We define collaborative processes as taking place within a three-dimensional collaborative “space,” consisting of the dimensions of platform, structure, and media (modeled in the type sketch after this list). The collaborative space supports the context and substance of interaction between users scattered across geographies, time zones, projects, and organizations.
• Collaborative platform refers to the geographic, physical, and technological environment in which work is performed—in other words, the means of production and distribution. The dimensions of the collaborative platform include user terminals and application software, operating environments, networks, geographic range, and mobility. The collaborative platform incorporates the “network services model” of core enterprise services (file, print, directory, security, messaging, web, management), as well as the physical infrastructure supporting these services.
• Collaborative structure consists of the organizational apparatus and controls used to define, coordinate, and track business processes. In a company’s information technology environment, the collaborative structure resides in the sum total of automated information systems that implement and enforce organizational controls. The collaborative structure in any organization is constrained by the presence or lack of appropriate directory, security, and management functionality in the underlying network/systems platform. Structure-oriented collaborative applications fall into three main categories: time management, workflow management, and project management. We characterize these applications as “high structure” in emphasis. High-structure applications are designed to directly support threads or protocols for interpersonal coordination within the management “superstructure” within which most coordination takes place: work schedules, operating procedures, and organizational charts, as well as the associated task breakdowns and assignments.
• Collaborative media consist of the work products and all raw and semi-finished materials—including information and communications inputs—used to give the product shape, substance, and coherence. Media are the things that flow in a business process, within the context of a technological platform and organizational structure. Media fall into three general categories: stores, threads, and meetings. Media-oriented collaborative applications fall into three categories: information-base sharing, messaging, and conferencing. We characterize these applications as “high media” in emphasis. The best way to characterize high-media collaboration environments is by the range of information stores that they incorporate, integrate, and access. Indeed, one can regard Internet-based high-media collaboration environments as consisting of three layers of overlapping information stores, corresponding to the three principal application tiers in an enterprise network: collaboration, intranet, and NOS. A truly comprehensive Internet/intranet collaboration environment would include the following information stores: file, document, data, directory, message, newsgroup, image, webpage, and object.
• Traditional database-oriented corporate applications are “low-structure” in emphasis, since they usually do not directly support interpersonal coordination but nonetheless embody or enforce impersonal corporate policies and procedures. Examples of low-structure applications, according to this definition, include mission-critical applications in support of payroll, order processing, and customer service. Indeed, most enterprise applications fall into the “low-structure” category, imposing pervasive controls on operations by defining or embedding these controls within the underlying information stores.
• Many of the underlying authentication, access, and confidentiality controls used by both high- and low-structure applications are defined within the enterprise NOS, directory, security, and management services. Consequently, the enterprise directory service represents the pivotal piece of infrastructure for enabling both high- and low-structure applications.
• A well-rounded collaboration (or “groupware”) environment is one that provides a broad range and smooth blend of high-media and high-structure functionality. Groupware point products, by contrast, are those that emphasize one or the other type of functionality, and often a subset within one of these categories (e.g., any of myriad specialized messaging, conferencing, calendaring, or workflow products).
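For the code-minded, here is that three-dimensional space rendered as a minimal TypeScript type sketch; every name below is my own shorthand, not any standard schema:

```typescript
// A minimal sketch of the collaborative "space" framework above.
// All names are my own shorthand; nothing here is a standard schema.

// Dimension 1 (platform): where, and with what, the work is performed.
interface CollaborativePlatform {
  coreServices: Array<"file" | "print" | "directory" | "security" |
                      "messaging" | "web" | "management">;
  clients: string[];        // e.g., thick client, basic browser, handheld
  geographicRange: string;  // e.g., campus, national, global
}

// Dimension 2 (structure): the organizational apparatus and controls;
// "high structure" applications coordinate people directly.
interface CollaborativeStructure {
  applications: Array<"time management" | "workflow management" |
                      "project management">;
  superstructure: string[]; // schedules, procedures, org charts
}

// Dimension 3 (media): the things that flow in a business process.
interface CollaborativeMedia {
  categories: Array<"stores" | "threads" | "meetings">;
  stores: Array<"file" | "document" | "data" | "directory" | "message" |
                "newsgroup" | "image" | "webpage" | "object">;
}

// A collaborative process occupies a point in this three-dimensional space.
interface CollaborationSpace {
  platform: CollaborativePlatform;
  structure: CollaborativeStructure;
  media: CollaborativeMedia;
}
```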
Yeah, yeah, I know, that's messy as hell. Not much harmony for the uninitiated. But such are most Collaboration paradigms. If they get too simple, they get too simplistic. Or too fascistic.
So here's an even simpler version of all that: Any collaboration space can be characterized as meetings, messages, materials, teams, times, tasks, and flows.
Or, even simpler: A collaboration space is stuff that needs doing, and the stuff it gets done with.
Oh well, it's "feng shui" harmonious for me, at the very least. It keeps coming to mind. It helps me clear up my personal confusion. And classify every new Collaboration tool.
And get through my Collaboration-crazy days.
Jim
Tuesday, March 22, 2005
fyi France Plans Net 'Counter-Offensive'
All:
Pointer to article:
http://www.newsfactor.com/story.xhtml?story_id=31513
Kobielus kommentary:
I love the new word mentioned in this article: “omnigooglization.” And I’m happy for the French that they’re committed to putting the core of their literary legacy online for posterity, and to be swept up into the omnigooglizing cyberverse.
I seriously doubt that the French language is endangered, either as a written or spoken tongue. Since it’s written in a Latin character set that’s pretty close to English’s, more people will have online exposure to French-language texts and will roughly familiarize themselves with the tongue (for rudimentary reading, if not speaking or writing). In fact, the same phenomenon will buoy any tongue written in a phonetic alphabet derived from Latin, as long as there’s a significant amount of online literature for that language. We have to face the fact that most people can vocalize the basic Latin alphabet and, hence, will be able to hack a rudimentary understanding of any tongue based on it. That includes not just the western European languages, but also such non-European tongues as Vietnamese and Indonesian.
However, any literature that isn’t written in a Latin-derived alphabet (e.g., Greek, Russian, Hindi, Arabic, Mandarin, Japanese) will tend to be shunned or ignored by non-speakers who find its basic character set much too difficult to hack. That’s not to say that native speakers of these languages won’t continue to teach them to their children, use them in daily life, and produce wonderful literary works in them. It’s just that English has burned a Latinate character set into most people’s brains.
And that character set is the true world “lingua franca” (a term, and a phenomenon, that should warm the cockles of Jacques Chirac’s heart). My hunch is that, in 500 years’ time, the world lingua franca will be a creolized mixture of English, Spanish, French, and Malayo-Indonesian. I’ve selected those four languages because, in that descending order, they have the most native speakers in 2005 among tongues written in the Latin character set. If the speakers of some other major world language, such as Mandarin or Arabic, adopt a Latin-derived character set, they may also get swept into this creolizing world tongue. Otherwise, their literature will be purely opaque to most non-speakers, regardless of how much of it is posted to the cyberverse.
Jim
fyi Microsoft considering WinFS support in Windows XP
All:
Pointer to article:
http://www.computerworld.com/softwaretopics/os/windows/story/0,10801,100387,00.html?source=NLT_WK&nid=100387
Kobielus kommentary:
It’s very likely that Microsoft will eventually support all principal “Longhorn” technologies—“Avalon,” WinFS, “Indigo,” and the WinFX APIs—on its most recent “legacy” OSs: WinXP and WinSvr2003. If and when that comes to pass, Microsoft will have taken the next big step in the direction of pure OS virtualization. When you abstract the external interface from the internal implementation of any functionality, you’ve virtualized that functionality. If all “Longhorn” external interfaces have been abstracted from the underlying kernel implementations on which they run (WinXP kernel, Win2003 kernel, “Longhorn” kernel, etc.), then “Longhorn” is not a traditional OS anymore. It becomes a set of virtualization technologies that bridge legacy Windows to future Windows. And which, conceivably, through open source implementations such as Mono, can bridge the Windows and Linux worlds.
At that point, Microsoft will have a harder time convincing users to upgrade to the “Longhorn” kernel, if they can get the same functionality by simply retrofitting their existing OS. Virtualization breaks the upgrade cycle upon which platform and application software vendors’ licensing revenues greatly depend.
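To illustrate the abstraction argument in miniature, here is a hedged sketch; the interface and class names are invented stand-ins, not the actual WinFS/WinFX surface:

```typescript
// A minimal sketch of interface/implementation virtualization.
// All names are invented stand-ins, not the real WinFS/WinFX API.

// The external interface that applications program against.
interface StorageApi {
  store(item: string): void;
  query(criteria: string): string[];
}

// One implementation per underlying kernel; callers can't tell them apart.
class RetrofitXpStorage implements StorageApi {
  private items: string[] = [];
  store(item: string): void { this.items.push(item); }
  query(criteria: string): string[] {
    return this.items.filter((i) => i.includes(criteria));
  }
}

class NativeLonghornStorage implements StorageApi {
  private items: string[] = [];
  store(item: string): void { this.items.push(item); }
  query(criteria: string): string[] {
    // Imagine richer, schema-driven search here; the contract is identical.
    return this.items.filter((i) => i.includes(criteria));
  }
}

// Applications bind to the interface, not the kernel, so the same code
// runs unchanged on whichever OS the interface has been retrofitted to.
function runApp(storage: StorageApi): void {
  storage.store("quarterly report.doc");
  console.log(storage.query("report"));
}

runApp(new RetrofitXpStorage());     // same behavior on the "legacy" OS
runApp(new NativeLonghornStorage()); // as on the new kernel
```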
Jim
fyi The Rise of Smart Buildings
All:
Pointer to article:
http://www.computerworld.com/networkingtopics/networking/story/0,10801,100318,00.html?source=NLT_WK&nid=100318
Kobielus kommentary:
Building automation control (BAC) systems are an obvious application of the “identity of things,” right there alongside RFID and the identity dataweb in potential ubiquity. It’s good to see that OASIS’ Open Building Information Exchange (OBIX) TC is composing its specs with WSRF, WSDM, and other critical WS-* standards.
As regards BAC and the identity of things, the following article excerpt should raise alarm bells:
• “Security is a problem at multiple levels, says Toby Considine, chairman of the OASIS OBIX committee. Control system manufacturers have rudimentary password security mechanisms, but most have ‘no concept of directory-enabled security,’ he says.”
Hmmm…does that mean there’s no directory-centric device authentication, or multifactor facility administrator authentication on many BAC systems? If that’s so, then we’re opening our buildings to all manner of imposters masquerading as facility administrators, custodians, etc. And, lacking directory-centric device authentication, wouldn’t it be possible to plug in rogue automated doorlocks, security cameras, and so forth in many BAC systems and have them go unauthenticated and undetected? And we’re worrying about people bringing cellular cameraphones into offices! Imagine the even greater damage that can be done by having rogue fixed cameras and microphones in your offices 7x24, aimed at computer screens, eavesdropping on conversations, etc.
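To pin down what directory-centric device authentication would even mean for a BAC controller, here is a minimal sketch; the registry shape and credential scheme are entirely hypothetical:

```typescript
// A minimal sketch of directory-centric device authentication for a BAC
// controller; the registry shape and credential scheme are hypothetical.

interface DeviceRecord {
  deviceId: string;
  kind: "doorlock" | "camera" | "thermostat";
  credentialHash: string; // hash of a secret provisioned at install time
}

// Stand-in for an enterprise directory of known building devices.
const deviceDirectory = new Map<string, DeviceRecord>([
  ["lock-101", { deviceId: "lock-101", kind: "doorlock", credentialHash: "a1b2" }],
]);

// A password-only controller accepts any device that knows the password;
// a directory-backed check also rejects devices it has never heard of.
function authenticateDevice(deviceId: string, credentialHash: string): boolean {
  const record = deviceDirectory.get(deviceId);
  return record !== undefined && record.credentialHash === credentialHash;
}

console.log(authenticateDevice("lock-101", "a1b2"));  // true: registered device
console.log(authenticateDevice("cam-rogue", "a1b2")); // false: rogue camera
```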
Actually, I’d prefer that my BAC systems continue to function as separate technology silos until these authentication/directory issues are addressed by BAC and IT vendors. As we’ve learned from the Internet (spyware, spam, viruses, etc.), exposure is the flipside of interoperability. I’m not all that concerned with whether we can control HVAC systems room by room. Businesspeople don’t lie awake at night worrying about the office thermostat.
Jim
Tuesday, March 15, 2005
fyi IT's Role In The Technology-Based Economy
All:
Pointer to article:
http://www.systemsmanagementpipeline.com/shared/article/printablePipelineArticle.jhtml;jsessionid=GY55IDOXY2MTUQSNDBGCKH0CJUMEKJVN?articleId=60404700
Kobielus kommentary:
By way of preface to this commentary, I should note that one of my first jobs was as an economic analyst. My undergraduate degree was in economics, from the University of Michigan—Ann Arbor, where I was an honors student and wrote an award-winning thesis on competition in the television industry. Anybody who’s ever really known me knows that once I start covering a topic, I never stop. Which partly explains why I’m usually out of breath. I continue to try to pull my older thoughts forward into my next personal career stage. Per a recent “self” posting, I’ve long regarded my core career as a student of the economic development of mass media—such as, especially, the Internet/Web, which is the first wholly new mass medium since the advent of television 60 years ago.
Anyway, my fundamental feeling is that world economic development is driven primarily by the emergence of new networks, though sometimes at a considerable lag from when the network was first introduced. Go back historically and look at the power of irrigation networks (basis for civilization); maritime, riparian, railroad, automotive, and aviation networks (basis for international trade, exploration, conquest, and industry); and electrical, telegraph, telephone, television, radio, and computer networks (basis for information society). Each of these networks can be associated with one or more “long waves,” per Schumpeter, Kondratieff, and other economists.
Networks drive development. They provide new ways of connecting people to each other and to the resources of the earth and sky. As transaction costs drop, economic development soars. An article I read a few years ago put all this in the right perspective. It provided a broader strategic framework for understanding the transaction-cost impacts of Moore's Law ("Every 18 months, processing power doubles while cost holds constant."), Metcalfe's Law ("The usefulness, or utility, of a network equals the square of the number of users."), and Coase's Law ("Firms are created because the additional cost of organizing them is cheaper than the transaction costs involved when individuals conduct business with each other using the market.").
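Taking those quoted rule-of-thumb formulations at face value, they reduce to a few lines of notation (the symbols are mine, not the article's):

```latex
% Moore: processing power per dollar doubles every 18 months,
% so unit cost C falls as (t in years):
C(t) = C_0 \cdot 2^{-t/1.5}
% Metcalfe: network utility U grows with the square of the user count n,
% so doubling users quadruples utility: U(2n)/U(n) = (2n)^2/n^2 = 4.
U(n) \propto n^2
% Coase: a firm exists while organizing internally beats the market:
C_{\mathrm{org}} < C_{\mathrm{market}}
```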
The bottom line on all this is that:
• Declining transaction costs spur economic activity (per Coase)
• Transaction costs in today's economy keep declining due to declining costs for fundamental resources (processing power, per Moore; network connectivity, per Metcalfe)
• Transactional cost reductions throughout the economy lower the barriers to new competitors, mergers, acquisitions, divestitures, alliances, process reengineering, disintermediation, and other structural innovations
• Transactional cost differentiation (per Michael Porter's competitive/strategic framework) is the most powerful strategic differentiating weapon of all, making and breaking whole companies, industries, sectors, and economies
The fundamental value of networks (and all connected resources) as a resource to the economy/society may not be entirely quantifiable. The best we can say is that networks reduce transaction costs and make new types of transactions possible. And that all of this drives such economic indicators as innovation, investment, employment, and productivity, which can be quantified and which satisfy some basic human needs and wants on which we can all agree.
Actually, these "laws" are just rules of thumb for understanding some important phenomena and trends. None of them has any scientific weight as a theoretical assertion confirmed through controlled observation and peer review. It's hard to see what the economic basis for Moore's Law might be, or what the psychological basis is for Metcalfe's Law's equation of squared-humans with some metric of collective happiness (after all, squares aren't much fun to party with), or what the sociological basis is for Coase's Law's implicit notion that a firm requires at least two individuals operating as an economic unit (what about us freelance writers? aren't we firms unto ourselves?).
I get the sense that the late-90s dot-com run-up in the market was no fluke, and it wasn't irrational. Collectively, society had begun to appreciate the discounted present value of future economic growth to be stimulated by the new, ubiquitous network (Internet) and transaction model (e-commerce). People bid equities up commensurate with that value. What society didn't figure, though, is that the future value would be driven/realized largely by the successors (or the successors to the successors) to the e-commerce companies that emerged in the late 90s. General rule: In a fast-growing but new and volatile industry landscape, success will primarily accrue to the successors.
Microsoft has proved itself the ultimate successor time and again in so many markets. In this context, we can phrase “Gates' Law” as follows: As technology improves, new types of transactions with Microsoft will become more cost-effective, feasible, and, in fact, inevitable (and its corollary: the share of world income flowing to Redmond will increase).
All of which points up the fact that we've never had a "new economy," to use a term favored by many in the media these past ten years. The economy still runs on transaction costs, which juice liquidity. So much of this liquidity now rides the wire, but it still represents the age-old bottom line of balance sheets, financial exposure, and business and personal risk. In another article, Michael Porter and others debate whether there ever was a "new economy" immune to the historical cycles of capitalism. I like Porter's reductive comparison of the Internet to supermarket scanners (though others felt that was a ridiculous comparison): both innovations have contributed to a reduction in transaction costs, by supporting greater speed, flexibility, and automation in mundane buy-sell interactions. Later in the article, the author notes that banks' cost of processing transactions on the Internet is much lower than through ATMs or human tellers. He also notes that economic development has historically followed the development of new types of networks, including roads, highways, canals, electrification systems, telephone systems, radio, television, airplane flight paths, etc., all of which enabled a reduction in transaction costs across the whole economy as well as greater flexibility in putting together, executing, and reconfiguring transactions and relationships.
The Internet is just the latest networking environment having an impact on economic transactions across the world economy. In that sense, it never created a new economy so much as juiced the one, only, and ongoing economy. And that's a good thing.
Jim
Sunday, March 13, 2005
fyi Microsoft to buy Groove Networks
All:
Pointer to article:
http://www.nwfusion.com/news/2005/0310microtobu.html
Kobielus kommentary:
Yeesh--I find myself habitually commenting on Microsoft announcements, as if they're where all the action is. I have to get out of the habit. But they certainly are a perennial action figure.
Anyway, as regards the Groove acquisition: this is one of those deals that’s been expected for several years, and now that it’s finally here, it feels slightly anticlimactic. Though it’s definitely good for Microsoft and Groove (and their customers).
I’m actually happy that Microsoft waited till after the bloom was well off the “peer-to-peer” hype of a few years ago. Microsoft waited to see whether and how Groove and its approach would float in the post-bubble high-tech economy. Now Microsoft is buying Groove for all the right reasons: solid product, company, approach.
I’ve used Groove’s P2P collaboration tools and am not a fan. I consider Groove’s one of the clunkiest, most overstuffed UIs in the collaboration space. Yeah, it’s sorta nice that I can cache all manner of external documents and other junk for offline perusal and don’t need to dial into some central server and so forth. But I found Groove’s stovepipe lack of integration with the rest of my collaboration clients—browser, e-mail, IM, etc.—to be frustrating and counterproductive. And I found Groove’s navigation paradigm maddeningly labyrinthine. After months of use, I was still puzzled about where things were, or should be. And I got tired of being yoked to a “shared space” that I didn’t control—rather, one whose organization others dictated to me—an organization that was optimal for them and bewildering to me.
I think Microsoft should take the good stuff from Groove—the P2P-based distributed file sharing/caching/synchronization features—and junk everything else. Microsoft should embed this core Groove technology into Office, Outlook, Windows Explorer, MSN Messenger, Live Communication Server, Internet Explorer, SharePoint Team Services, and the WiFi-based workgroup peer-LAN functionality within “Longhorn.” But present a rich-browser interface to it all (via XAML/“Avalon”) and radically simplify and converge the UI, integration, and management of all these scattered client and server components.
Because Microsoft’s collaboration environment is far too complex. They have no clear unification story on all this. If Ray Ozzie does anything as Microsoft’s collaboration CTO, it should be to bring some vision and discipline to reining in this growing mess. Ironically, if he does his job well, he’ll effectively have killed both of his babies: Groove technology as a stand-alone P2P environment, and Lotus Domino/Notes (or IBM Workplace, or whatever they’re calling it now) as a competitive force in the collaboration market.
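To ground the “good stuff” in something concrete: the heart of Groove-style P2P synchronization is figuring out which files differ between two peers and shipping only those. Here’s a toy Python sketch of that idea, assuming naive whole-file hashing; real Groove did delta-oriented, encrypted synchronization, and every name here is hypothetical:
*********************
import hashlib
from pathlib import Path

def manifest(root: Path) -> dict:
    """Map each file's path (relative to root) to a SHA-1 digest of its bytes."""
    return {
        str(p.relative_to(root)): hashlib.sha1(p.read_bytes()).hexdigest()
        for p in sorted(root.rglob("*")) if p.is_file()
    }

def sync_plan(local: dict, remote: dict) -> tuple:
    """Decide what to pull from the peer, what to push, and what conflicts."""
    pull = {p for p, d in remote.items() if local.get(p) != d}
    push = {p for p, d in local.items() if remote.get(p) != d}
    conflicts = pull & push          # changed on both sides: UI must resolve
    return pull - conflicts, push - conflicts, conflicts

mine   = {"notes.txt": "aaa", "todo.txt": "bbb"}
theirs = {"notes.txt": "aaa", "todo.txt": "ccc", "new.doc": "ddd"}
print(sync_plan(mine, theirs))   # pull {'new.doc'}, push nothing, one conflict
*********************
The sync plan, not the shared-space UI wrapped around it, is the valuable part--which is exactly why it belongs inside Office, Outlook, and Windows Explorer rather than in a separate client.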
Jim
Friday, March 11, 2005
poem Able
ABLE
Mighty mighty meathooks,
oh how I love to grapple.
Immerse myself,
bloody beaut of a world,
I will.
Xactly where I want to be,
poised to fail.
You see me,
I'll be,
rising,
reaching,
able.
Wednesday, March 09, 2005
fyi Netgear Aims To Surmount Wireless 'Dead Zones'
All:
Pointer to article:
http://www.crn.com/sections/breakingnews/dailyarchives.jhtml?articleId=60406990
Kobielus kommentary:
One of WiFi’s drawbacks—indeed, any RF technology’s limitation—is signal fading, which causes dead zones. In order to optimize your WiFi cell to your office space, you would typically need to do rough-and-ready “coverage surveys”: in other words, move the access points around to see what positioning minimizes dead zones and maximizes signal coverage where it’s needed.
Ideally, your WiFi RF coverage plan should be self-configuring and self-optimizing: just install one or more access points at regularly spaced intervals throughout your space (as a first rough approximation) and rely on the access points themselves to auto-configure their power output, beam directions and widths, and so forth. Or, as a second best, rely on them to display to you—the human being who installed them—which way you should move them, and how far, to optimize signal coverage.
Adaptive antenna arrays are an important enabler for self-optimizing WiFi. We’re going to see adaptive arrays become standard in all WiFi access points before long. With adaptive arrays, each antenna housing contains multiple internal antennas that are dynamically auto-configured to optimize signal coverage, minimize interference, and tune other parameters. Back in the day (1995-1998), when I was a product manager for RF test/measurement equipment for the cellular industry, I attended a conference at Virginia Tech on new frontiers in wireless tech. The most memorable session was a presentation on “Feasibility of Adaptive Arrays,” and I’ve been a firm believer in the technology ever since--not just for cellular systems (which have the benefit of teams of professional RF engineers) but especially for premises-based systems such as WiFi (which are often installed and optimized by non-RF professionals, who stand to benefit most from auto-optimization).
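For the non-RF-engineers among my blogreaders, here’s a toy Python sketch of the selection loop an adaptive-array access point effectively runs. The measure_snr function is a made-up stand-in for the firmware’s channel probing, and real arrays adjust per-antenna phase and gain weights rather than simple on/off patterns:
*********************
import random

def measure_snr(config: tuple) -> float:
    """Made-up stand-in for firmware that probes the channel and reports
    the signal-to-noise ratio (dB) seen with a given antenna pattern."""
    random.seed(hash(config))             # deterministic fake measurement
    return random.uniform(5.0, 40.0)

def best_configuration(n_antennas: int = 7) -> tuple:
    """Try every on/off antenna pattern and keep the best. Seven antennas
    give 2**7 - 1 = 127 usable patterns -- the 'over 100 configurations'
    class of search."""
    candidates = (
        tuple((mask >> i) & 1 for i in range(n_antennas))
        for mask in range(1, 2 ** n_antennas)     # skip the all-off pattern
    )
    return max(((c, measure_snr(c)) for c in candidates), key=lambda t: t[1])

config, snr = best_configuration()
print(f"selected antenna pattern {config} at {snr:.1f} dB")
*********************
In a real access point this loop runs continuously against live channel conditions, but the brute-force shape of the search is the same.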
Needless to say, I’m enthused about Netgear’s new RangeMax products that incorporate adaptive array technology. Per their press release:
“RangeMax is an advanced Smart MIMO (Multi-In, Multi-Out) technology that uses seven internal antennas. RangeMax constantly surveys your home environment for physical barriers and interference and adjusts the wireless signal to compensate for these performance blockers. For example, if you carry your laptop from the family room to the bedroom, RangeMax automatically senses the change and selects from over 100 possible antenna configurations to deliver you the fastest, clearest connection! Everyone will enjoy consistently high-speed connections, everywhere in your house - no drop-outs, and no dead spots.”
Way cool. And essential for the continued deployment of WiFi into every available indoor space across the whole wide world.
Jim
P.S. Here’s the poem I wrote in 1996 that leveraged this totally geeky concept into a mystical/spiritual riff (see if you can spot the acrostic):
*******************************
FEASIBILITY OF ADAPTIVE ARRAYS
Trained to receive, every fiber of me, every facet of you.
Unimpeded, one path, one singular path, through crowded sensations.
Recognizing our kind.
Now turning away.
Trying to receive, every moment of time, every flicker of thought and soft emanation.
Unobstructed, then blocked, then scattered through layers, accidental relations.
Recollecting our mind.
Now turning away.
True to my thoughts, we're moving in tandem.
Unawares, we're speaking the language of light.
Reconstructing our world through massive obstruction.
Now moving closer, now turning away.
*******************************
Read the story. Now read the poem. Go back to the story. Then the poem. It’ll sink in. Most people wouldn’t have a clue what I’m talking about. But you, my blogreaders, are techies too. Nature abhors a dead zone.
self Careers
All:
I have a philosophy of careers, just as I have a philosophy of almost everything. I get my jollies.
On the one hand, careers are often just one damn job after another. A chain of steppingstone opportunities starting from the day you first filled out an application to the day you hang it all up and go to that lounge chair in the sky, or under Florida skies.
On the other, careers are all that—plus the story you tell, to yourself and others, that connects all those seemingly accidental steps into something resembling a life plan—what it all amounts to, and what you yourself are amounting to. The meaning or purpose or end result of it all. How your presence here made something resembling an enduring contribution. Or simply something transient and meaningful to you alone as some sort of private joke or quixotic quest. How you rolled with the flow and were ultimately floated to some new plateau—hopefully high enough to see over the next incoming wave.
That story you tell about what it all means—it’s written up in your resume (the dots) and recited in your interview with the next potential employer (the connections among the dots, aligning your career trajectory with that employer’s trajectory, as best you can). Sometimes—often—we surprise ourselves by the connections we draw, when asked to explain “what do you want to do with your career.” Sometimes—often—we don’t truly know exactly where we’re going until we see a shiny new vehicle and ask to be taken aboard.
I’m no mystic, and I don’t pretend to know whether God has a plan for me, or whether anything I’ve done will amount to much in the final analysis. The “what it’s amounting to” can be answered on several levels. One metric is your bank account. Another is the length and depth of your resume (and, in my case, considering how much I’ve published these past 20 years, my bibliography). Yet another is your Rolodex, or your e-mail address book, or simply the list of professional friends (or, hopefully, friend-friends) you’ve accumulated over the years.
But, fundamentally, it comes down to that story. Is my next step advancing me along that path toward whatever it is I see my life amounting to? How can I know till I’ve taken that step whether it was truly the best step?
As people say, God has plans for all of us. But I think God delegates a lot of the planning to each of us individually. He just rolls it all up at the end of eternity into some master plan that gets stuffed into a drawer somewhere. And stays there.
Jim
Tuesday, March 08, 2005
fyi Microsoft Outlines Two-Phase Business Application Plan
All:
Pointer to article:
http://www.informationweek.com/story/showArticle.jhtml?articleID=60406981
http://www.microsoft.com/presspass/press/2005/mar05/03-07Convergence05UmbrellaPR.asp
Kobielus kommentary:
Now this is a roadmap that makes sense, on a couple of levels.
Support-wise, there’s no way that Microsoft’s ERP/CRM/etc. app customers would tolerate anything less than an eight-year mainstream support lifecycle. That’s what SAP offers its customers, and it sets the high bar that Microsoft is attempting to pole-vault over. Extending mainstream support on Great Plains and the rest to 2013 was a critical component of securing existing customers’ ongoing loyalty. Packaged business apps have very long lifecycles in customer deployments. Microsoft can’t attempt to migrate older customers wholesale to the “Project Green” codebase overnight, or even over the next five years. Customers will migrate only if and when it makes sense. They’ve built their internal business processes on existing ERP/CRM/etc. apps, and those legacy codebases are part of customers’ business-process DNA.
Rollout-wise, Microsoft did the right thing by spreading the “Project Green” enhancements over two separate releases. The first and most critical step toward the “Project Green” architecture is integrating the presentation tiers of all existing Microsoft ERP apps, providing a shared presentation tier that knits the legacy apps into a common collaboration environment. Microsoft is relying on SharePoint Portal Server in the same way that SAP has built its NetWeaver architecture around SAP Enterprise Portal: as a unified presentation tier front-ending both the new generation of apps (mySAP) and the legacy generation (R/3). For both Microsoft and SAP, the portal is more than a presentation tier: it’s also the platform for a wide range of collaborative functionality. I also think Microsoft is doing the right thing by defining a core group of user roles common to all new and legacy ERP/etc. apps. Collaborative commerce demands an identity management infrastructure that supports what you might call “federated role management” among diverse business domains.
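What might “federated role management” look like in practice? At minimum, a shared mapping from each partner domain’s local role names onto a role vocabulary that every app, new and legacy, agrees on. Here’s a trivial Python sketch; all domain and role names are hypothetical:
*********************
# Hypothetical per-domain role vocabularies mapped onto a shared vocabulary
# that both new and legacy apps could consume.
ROLE_MAP = {
    "contoso.com":  {"AP-Clerk": "accounts_payable", "Buyer": "purchasing"},
    "fabrikam.com": {"Kreditor": "accounts_payable", "Einkaeufer": "purchasing"},
}

def federated_role(domain: str, local_role: str):
    """Translate a partner's local role into the shared vocabulary (or None)."""
    return ROLE_MAP.get(domain, {}).get(local_role)

assert federated_role("fabrikam.com", "Einkaeufer") == "purchasing"
assert federated_role("contoso.com", "CFO") is None   # unmapped role
*********************
The lookup table is the easy part, of course; the hard part is governing who gets to write to it across organizational boundaries.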
I’d like to see Microsoft go a little further in “Project Green” phase two. It’s not enough to simply declare that they’ll provide a common VS.NET model-driven development environment spanning all ERP/etc. apps. Microsoft should expose all fine-grained ERP/etc. app functionality--“Project Green” and legacy--as Web services, include the WSDL definitions of all that functionality in its UDDI registry, and provide a visual VS.NET/Visio/UML/WS-BPEL orchestration tool that makes “connecting the dots” among all those features as easy as possible.
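Here’s roughly what “fine-grained functionality as a Web service” means at the wire level: a Python sketch that builds the SOAP request such a WSDL-described ERP operation would receive. The endpoint, namespace, and operation names are all invented for illustration:
*********************
import urllib.request

# All names below are hypothetical -- stand-ins for one fine-grained ERP
# operation that a WSDL document (registered in UDDI) would describe.
ENDPOINT = "http://erp.example.com/services/OrderService"
SOAP_ACTION = "http://example.com/erp/GetOrderStatus"

envelope = """<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetOrderStatus xmlns="http://example.com/erp">
      <OrderId>12345</OrderId>
    </GetOrderStatus>
  </soap:Body>
</soap:Envelope>"""

request = urllib.request.Request(
    ENDPOINT,
    data=envelope.encode("utf-8"),
    headers={"Content-Type": "text/xml; charset=utf-8",
             "SOAPAction": SOAP_ACTION},
)
print(request.get_method(), request.full_url)   # POST, plus the target URL
print(envelope)
# urllib.request.urlopen(request) would actually send it; skipped here,
# since the endpoint is fictional.
*********************
Once every such operation is described in WSDL and discoverable in UDDI, an orchestration tool can treat these calls as boxes to be wired together--that’s the “connecting the dots” payoff.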
Microsoft should provide much more detail on its “Project Green” roadmap soon if it hopes to get a jump on SAP and Oracle at the high end of the ERP/etc. market. Those vendors have a more aggressive, farsighted SOA-based approach to their business-app product families.
Jim
Friday, March 04, 2005
lol The Eggcorn Database
All:
Pointer to site:
http://eggcorns.lascribe.net/
Kobielus kommentary:
Brilliant! I’m so glad somebody’s keeping track of these written malapropisms, which they refer to as “eggcorns.” Here’s their definition of an “eggcorn”:
• “This site collects unusual spellings of a particular kind, which have come to be called eggcorns. Typical examples include free reign (instead of free rein) or hone in on (instead of home in on), and many more or less common reshapings of words and expressions: a word or part of a word is semantically reanalyzed, and the spelling reflects the new interpretation. The About page offers more information on the history of the term and of this collection.”
And they doubly impressed me by having “baited breath” (which is a very common misspelling of “bated breath”). My mnemonic for this particular item is that “bated” is a shortening of “abated,” which means to “reduce the force or intensity of.” Hence, someone may wait with “bated breath”—in other words, self-consciously reduced, quieted, calmed breathing—for some anticipated event. But no one waits with “baited breath,” unless they’ve swallowed a bucket of minnows. Hey, a new stunt for “Fear Factor”! Here’s an ersatz Confucian proverb I made up in 1987, when this “eggcorn” jumped out at me:
• “He who waits with baited breath catches trout with forked tongue.”
Before I close this post, I'd like to note that these "eggcorns" are beautiful examples of "language as an object worthy of contemplation," per a previous "imho" post. Language's ambiguity (deliberate or accidental) is part of what can make poetical or rhetorical expression so powerful. The best poetry/rhetoric carries inside itself the penumbra of all the unspoken, invisible, and latent words, feelings, and concepts that the written words suggest or imply. Let me close this post with an example, a poem that I wrote a while ago. In reading this little capsule, ask yourself which implicit words/thoughts scream from the sidelines, and how they shade/inform the written words/thoughts. Half the time, I'm not entirely conscious of the penumbra poem until I've laid down all the visible words. OK, here's the example poem:
*********************
COMMENT
Deaden the anger with air
and a prayer for forgiveness.
Live it through. A sore outlasts
its irritant: a pain, the
point. A rash word holds the hurt
in the firmament of the
world’s regard. Curse: a comet
impressed into flesh’s fate,
a stone with a name, a mark
and a flame returning. Leave
these currents burn. Let the night
reset what day has torn. Pray
it take this weary frame. Void
me now in the calm to come.
*********************
Jim
Wednesday, March 02, 2005
lol Microsoft Patches "Blue Screen Of Death" In Windows XP SP2
All:
Pointer to article:
http://www.techweb.com/wire/60402941
Kobielus kommentary:
What the…? Even the blue screen of death is buggy? Can’t Microsoft even engineer a graceful death sequence? They ought to consult with Dr. Jack Kevorkian.
Jim
fyi GSM at sea
All:
Pointer to article:
www.cellular-news.com/story/12153.shtml
Kobielus kommentary:
Isn’t this the core market that Iridium and all those other defunct global mobile satellite carriers intended to address? So pathetically puny. Even in the early 90s, when all of these ventures were in the insane-hype stage, you had to wonder what the VCs were smoking when they greenlighted these pipedreams. Even then, the terrestrial cellular carriers had a lock on the bulk of the mobility market. Now that GSM base stations are being deployed on cruise ships, it feels like the coup de grâce. I’m curious (the article doesn’t say) what satellite service provides the link between the shipside GSM network and the “adjacent” shoreside terrestrial GSM network(s). I’m also curious whether the shipside GSM operators offer shipside WiFi-to-GPRS roaming (and whether they have WiFi service to begin with). I don’t know about you, but if I were cooped up on a vomitrocious vessel for days or weeks, I’d at least want a broadband porthole to stare out of.
Can you tell that I didn’t enjoy my one and only experience with cruise ships? The worst thing about them is being inside the vessel, feeling the rocking, but not being able to see the horizon. That’s when seasickness really takes hold. If I could have slept up on deck, in full view of the ocean, I would have quickly gained my sea legs. Or if I could have had a full view of the ocean of media from the shoreside world, it would have been less disorienting. As it was, I had to make do with junky lounge acts, second-rate Hollywood movies, and semi-edible continuous buffets.
Gag me with a spoon.
Jim