In discussing the problem of real time with a corporate user-experience manager, the question was posed: “How should one manage resources to respond to a real-time event using a corporate-level content management system?” In the past, real-time events — say, large storms or earthquakes — caused an all-hands-on-deck response. Publishing timely information required many hours of production time to surface the appropriate information on the corporate web presence.
The problem is really one of information hierarchy and architecture as controlled and structured by a corporate content management system. When a batch publishing system is used to respond to a real-time event, the cost of publishing is very high. On the other hand, when the time of the event is controlled by the corporation — a campaign or a product release — the batch CMS software performs with the anticipated economics. The early days of the web were dominated by this kind of controlled publishing, and automated systems were developed to manage it.
One difficulty with the real-time event is that it doesn’t have a permanent home in the tree-structured information hierarchy except as a generalized real-time event. Placing real-time content in a semantically proper position in the information architecture is sensible, but fails to account for the time-value of the external event. These events are generally tacked onto the home page through a special announcement widget with a hyperlink to a page that hovers around the edge of the information hierarchy. In the end, it becomes an orphan page and is deleted, because once time passes, it no longer has a place that makes sense.
Blogs and Twitter have emerged under the moniker of social media, but at bottom they’re instant publishing systems. The key difference is that their information hierarchy is completely flat: items are posted in chronological order without regard for their semantic position in a system of meaning. On a secondary level, the items can be searched and categorized into ad hoc sense-making collections. In a recent online interview, IBM’s Jeff Jonas noted that batch systems never evolve into real-time systems, but that real-time systems can spin up batches all day. This approach to information publishing and organization was pioneered by David Gelernter and variously called chronicle streams, lifestreams, and information beams. In this model, time has priority over information space. The structure of real-time is based, not surprisingly, on the unfolding of time.
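The lifestream model described above can be made concrete with a small sketch. This is a hypothetical illustration, not Gelernter's actual system: items land in a single append-only, time-ordered stream, and any "collections" are just ad hoc filters computed over that flat stream after the fact.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical sketch of a lifestream-style store:
# a flat, time-ordered sequence of items with no fixed hierarchy.
@dataclass
class Item:
    text: str
    tags: set
    posted_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class Lifestream:
    def __init__(self):
        self._items = []          # append-only, chronological

    def post(self, text, *tags):
        item = Item(text, set(tags))
        self._items.append(item)  # time decides position, not topic
        return item

    def recent(self, n=10):
        # newest first: the "head" of the stream
        return list(reversed(self._items))[:n]

    def substream(self, *tags):
        # ad hoc, secondary sense-making: filter the flat stream
        wanted = set(tags)
        return [i for i in self._items if wanted & i.tags]

stream = Lifestream()
stream.post("Earthquake reported near HQ", "alert", "earthquake")
stream.post("Q3 product launch scheduled", "campaign")
stream.post("Aftershock update", "alert", "earthquake")

alerts = stream.substream("alert")
print([i.text for i in alerts])
# → ['Earthquake reported near HQ', 'Aftershock update']
```

Note that nothing here resembles a site tree: the only primary structure is arrival order, and the `substream` view is recomputed on demand.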
The real-time Network enables social media through the low-latency exchange of messages. But socializing is only one aspect of what real-time systems enable. The National Security Agency continues to push the boundaries of what is possible on this front. It’s not just people who interact in real time; each and every day, our whole world is filled with surprising and unexpected events.
The real-time event is changing the corporate web presence in a number of ways. Real-time sense-making of data captured in real time is clearly coming. But the real-time visibility that Twitter and other real-time systems provide to users has put pressure on corporate enterprises to respond in close to real time. The content management systems that the enterprise has invested in are, for the most part, the wrong tool for the job. Initially, batch publishing systems will need to be supplemented with real-time systems. Eventually, corporate web presence will be managed using a fully real-time system.
The interesting question is whether corporate web presence will be published as a time-sequenced real-time feed and secondarily made available in ad hoc personalized data collections — or whether the traditional tree structure, the hierarchical information architecture, will continue to dominate: batch vs. real-time architectures.
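If the feed wins, the familiar tree need not disappear; it becomes a derived view, which is one way to read Jonas's remark that real-time systems can spin up batches all day. A minimal sketch, with invented sample data, of deriving a hierarchical "batch" view on demand from a flat chronological feed:

```python
from collections import defaultdict

# Invented sample feed: (date, section, title) in chronological order.
feed = [
    ("2024-01-05", "alerts", "Storm warning issued"),
    ("2024-01-06", "products", "Widget 2.0 released"),
    ("2024-01-07", "alerts", "Storm all-clear"),
]

def batch_view(feed):
    # Group the time-ordered items into sections, like a site hierarchy.
    # The tree is computed from the feed, never stored as the primary form.
    tree = defaultdict(list)
    for posted_at, section, title in feed:
        tree[section].append((posted_at, title))
    return dict(tree)

site = batch_view(feed)
print(site["alerts"])
# → [('2024-01-05', 'Storm warning issued'), ('2024-01-07', 'Storm all-clear')]
```

The reverse direction is the hard one: a tree-first CMS has no cheap way to reconstitute the stream of events it flattened away, which is the asymmetry the essay turns on.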