
Tag: time

Boundaries of the Real-Time Conversational Flow through the Network

The time tunnel

There are many kinds of “time.” Sometimes we use the adjective real to describe time. As we talk about the Live Web and begin to imagine the possibilities of XMPP, a new set of online experiences comes into focus. Real time computing has come to mean very short system response times. But how short is short? Where are the borders of the real time experience? What are the human factors?

Jakob Nielsen is as good a place to start as any. In his book Usability Engineering, he discusses Robert B. Miller’s classic paper on Response Time in Man-Computer Conversational Transactions. Miller talks about three threshold levels of human attention.

  • 0.1 second is about the limit for having the user feel that the system is reacting instantaneously, meaning that no special feedback is necessary except to display the result.
  • 1.0 second is about the limit for the user’s flow of thought to stay uninterrupted, even though the user will notice the delay. Normally, no special feedback is necessary during delays of more than 0.1 but less than 1.0 second, but the user does lose the feeling of operating directly on the data.
  • 10 seconds is about the limit for keeping the user’s attention focused on the dialogue. For longer delays, users will want to perform other tasks while waiting for the computer to finish, so they should be given feedback indicating when the computer expects to be done. Feedback during the delay is especially important if the response time is likely to be highly variable, since users will then not know what to expect.
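
Taken as a rule of thumb, these thresholds almost amount to a decision table for interface feedback. Here is a rough sketch of how they might be encoded; the function and the feedback categories are my own illustration, not anything Miller or Nielsen prescribed:

```python
# A minimal, illustrative mapping from Miller's attention thresholds to a
# feedback strategy. The function and the category strings are hypothetical;
# they just encode the rule of thumb quoted above.

def feedback_for(response_time_seconds: float) -> str:
    """Suggest an interface feedback strategy for a given response time."""
    if response_time_seconds <= 0.1:
        # Feels instantaneous: just display the result.
        return "none: display the result"
    if response_time_seconds <= 1.0:
        # Flow of thought survives, though the user notices the delay.
        return "minimal: no special indicator needed"
    if response_time_seconds <= 10.0:
        # Attention is at risk: show that work is in progress.
        return "progress indicator"
    # Beyond ten seconds users will switch tasks; estimate completion time.
    return "progress indicator with estimated time remaining"

for t in (0.05, 0.5, 3.0, 30.0):
    print(f"{t:5.2f}s -> {feedback_for(t)}")
```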

The other rule of thumb is Akscyn’s Law:

  • Hypertext systems should take about 1/4 second to move from one place to another. 
  • If the delay is longer people may be distracted. 
  • If the delay is much longer, people will stop using the system. 
  • If the delay is much shorter, people may not realize that the display has changed. 

This puts the range of real time interaction between 1/10 and 1/4 of a second. This gives us some sense of the boundaries for the flow of a real time conversation through the network. The maxim that “faster is better” is supported in the laboratory. Experimental research by Hoxmeier and DiCesare on user satisfaction and system response time for web-based applications reported findings on the following hypotheses:

  • Satisfaction decreases as response time increases: Supported
  • Dissatisfaction leads to discontinued use: Supported
  • Ease of use decreases as satisfaction decreases: Supported
  • Experienced users more tolerant of slower response times: Not Supported

But the war against latency in system response has gone well beyond tenths of a second to thousandths of a second. The front lines of that battle are on Wall Street, or New Jersey to be more specific. Richard Martin of Information Week reports on data latency and trading in Wall Street & Technology.

Firms are turning to electronic trading, in part because a 1-millisecond advantage in trading applications can be worth millions of dollars a year to a major brokerage firm. That is why colocation — in which firms move the systems running their algorithms as close to the exchanges as possible — is so popular.

Wall Street isn’t stopping at milliseconds: “Five years ago we were talking seconds, now we’re into the milliseconds,” says BATS’ Cummings. “Five years from now we’ll probably be measuring latency in microseconds.”
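
The physics behind colocation is easy to sketch. Light in optical fiber travels at roughly 200 kilometers per millisecond, so distance alone puts a floor under latency. The figures below are back-of-the-envelope estimates, not measurements from any actual trading venue:

```python
# Back-of-the-envelope propagation delay: why colocation matters once the
# game is measured in milliseconds. Distances are rough figures, not
# measurements from any actual exchange or data center.

SPEED_IN_FIBER_KM_PER_MS = 200  # light in optical fiber moves at about 2/3 c

def round_trip_ms(distance_km: float) -> float:
    """Minimum round-trip time imposed by distance alone."""
    return 2 * distance_km / SPEED_IN_FIBER_KM_PER_MS

for label, km in [("same data center", 0.1),
                  ("across New Jersey", 50),
                  ("Chicago to New Jersey", 1150),
                  ("London to New Jersey", 5600)]:
    print(f"{label:>22}: {round_trip_ms(km):7.3f} ms round trip")
```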

If services like Twitter are going to scale up to become primary gesture/attention markets they’ll need to extend their real-time flow via an API to their partners. If they’re going to get that right, they’ll need to focus on delivering high volume, high quality data liquidity. The key question is under what terms that data will be available. The economics of real time stock exchange data is well established. Information asymmetry models assume that at least one party to a transaction has relevant information whereas the other(s) do not. Relevant information is a tradable advantage. Initially we just need enough speed to keep the conversation flow alive. But a live conversation is only the beginning of the creation of tangible value. The architecture of Wall Street’s trading systems provides us with a view into our future need for speed.

T.S. Eliot

Real time is important only as it relates to future time. Real time data is the input into the Bayesian calculation of the probability of future outcomes. Predicting the future correctly is big business.
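
To make that Bayesian framing concrete, here is a minimal sketch: a prior belief about a future outcome, updated as each piece of real time data arrives. The numbers, the likelihoods and the "signal" are invented purely for illustration:

```python
# A minimal Bayesian update: real time data arriving as evidence about a
# future outcome. The prior, the likelihoods and the "signal" are invented
# purely for illustration.

def bayes_update(prior: float, p_obs_if_true: float, p_obs_if_false: float) -> float:
    """Posterior probability of the outcome after one observation."""
    numerator = p_obs_if_true * prior
    return numerator / (numerator + p_obs_if_false * (1 - prior))

# Prior belief that some future event (say, a price move) will happen.
belief = 0.50

# Each tick either shows the signal or it doesn't; the signal is more likely
# if the event is coming (0.7) than if it is not (0.4).
for tick, signal_seen in enumerate([True, True, False, True], start=1):
    if signal_seen:
        belief = bayes_update(belief, 0.7, 0.4)
    else:
        belief = bayes_update(belief, 1 - 0.7, 1 - 0.4)
    print(f"after tick {tick}: P(event) = {belief:.3f}")
```

To understand the meaning of the flow of time, perhaps it’s best to start with T.S. Eliot.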

Time present and time past
Are both perhaps present in time future
And time future contained in time past.
If all time is eternally present
All time is unredeemable.
What might have been is an abstraction
Remaining a perpetual possibility
Only in a world of speculation.
What might have been and what has been
Point to one end, which is always present.

Burnt Norton by T.S. Eliot

 


Writing to the Stream: As Time Goes By

Heraclitus

Jon Udell talks about teaching civilians about syndication. This, of course, makes me think of Heraclitus. Udell would like his local school to stop posting calendars in PDF format and start using something like iCal, a format with a more formal structure. The idea is to write events that stream across a calendar, something that can be subscribed to, parsed, mixed and mashed up. The reason that it’s hard to change the way people think about data is that the stream is not part of the metaphor we put in front of our operating systems.
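
As a concrete illustration of what that more formal structure looks like, here is roughly what it takes to publish a single event as an iCalendar (.ics) feed instead of a PDF. The school, the event and the file name are invented; only the field names are standard iCalendar properties:

```python
# A tiny, hand-rolled iCalendar (.ics) feed: the kind of structured,
# subscribable event stream a school could publish instead of a PDF.
# The school and the event are invented for illustration.

from datetime import datetime, timedelta, timezone

def ical_event(uid: str, summary: str, start: datetime, length: timedelta) -> str:
    """Render a single VEVENT block (all times in UTC)."""
    fmt = "%Y%m%dT%H%M%SZ"
    return "\r\n".join([
        "BEGIN:VEVENT",
        f"UID:{uid}",
        f"DTSTAMP:{datetime.now(timezone.utc).strftime(fmt)}",
        f"DTSTART:{start.strftime(fmt)}",
        f"DTEND:{(start + length).strftime(fmt)}",
        f"SUMMARY:{summary}",
        "END:VEVENT",
    ])

calendar = "\r\n".join([
    "BEGIN:VCALENDAR",
    "VERSION:2.0",
    "PRODID:-//Example School//Calendar//EN",
    ical_event("board-2008-12@example.edu", "School Board Meeting",
               datetime(2008, 12, 15, 19, 0), timedelta(hours=2)),
    "END:VCALENDAR",
    "",
])

# Save it somewhere a calendar client or feed reader can subscribe to it.
with open("school-calendar.ics", "w", newline="") as f:
    f.write(calendar)
```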

There is nothing permanent except change. 
                         – Heraclitus

“The file system is dead.” The guy who said that agrees with Jon Udell. His name is David Gelernter, and he’s one of the first people to talk about organizing data in terms of time rather than space. Lifestreams was something Gelernter talked about before there was Flickr, Facebook, Twitter or FriendFeed. It’s simply a matter of changing the metaphor of the file system from a desk, file cabinets and a trashcan to something that more adequately fits the contours of our lives. In case you hadn’t noticed, we live our lives in both space and time.

What if, instead of saving to a file or printing something out, we saved to a stream? What if that were simply acting within the normal metaphor for Human-Computer Interaction? We’ve come a long way with the graphical user interface metaphors developed by Doug Engelbart and the folks at Xerox PARC, but we’re in a period of transition. We’re moving from the solipsistic, unNetworked desktop computer to the always already connected Network dashboard. We have an opportunity to expand the user interface metaphor we place between ourselves and the new internet operating system to include the concepts of time and the stream.
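
One way to make the thought experiment concrete: imagine that the only primitive is an append to a time-ordered stream, and that "documents" are just views over it. Everything below, names included, is hypothetical, a sketch of the metaphor rather than any existing API:

```python
# A thought experiment: what if "save" meant appending to a time-ordered
# stream instead of overwriting a file? Every name here is hypothetical.

from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class StreamEntry:
    body: str
    tags: set[str]
    public: bool = False
    written_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

class Lifestream:
    """An append-only, time-ordered stream of everything we write."""

    def __init__(self) -> None:
        self._entries: list[StreamEntry] = []

    def save(self, body: str, tags: set[str], public: bool = False) -> None:
        # Nothing is overwritten; new marks simply arrive later in time.
        self._entries.append(StreamEntry(body, tags, public))

    def view(self, tag: str) -> list[StreamEntry]:
        # A "document" is just a view over the stream, oldest entry first.
        return [entry for entry in self._entries if tag in entry.tags]

stream = Lifestream()
stream.save("Draft of the school calendar post", {"calendar", "draft"})
stream.save("School Board Meeting, Dec 15, 7pm", {"calendar"}, public=True)
print(len(stream.view("calendar")), "entries in the calendar view")
```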

The other starting point for thinking about time-bound, documented objects in a stream is with Bruce Sterling’s idea of Spimes. He discusses the kind of design thinking that might go into creating Spimes in his book Shaping Things. Boing Boing offers this summary:

A Spime is a location-aware, environment-aware, self-logging, self-documenting, uniquely identified object that flings off data about itself and its environment in great quantities. A universe of Spimes is an informational universe…

Sterling is speaking to the culture of industrial designers and the ecosystem of the manufactured object. But, of course, this doesn’t help with the problem of Jon Udell’s local school calendar.
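
Still, the Spime idea is easy to sketch as data: a uniquely identified object that keeps logging where it is and what is happening to it. The field names below are invented; this is a toy illustration of the concept, not Sterling's specification:

```python
# A toy Spime-ish record: uniquely identified, location-aware, self-logging.
# The field names are invented; this sketches the idea, not Sterling's spec.

import uuid
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class SpimeReport:
    object_id: uuid.UUID      # the same identity for the object's whole life
    latitude: float
    longitude: float
    note: str
    reported_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

chair = uuid.uuid4()
history = [
    SpimeReport(chair, 37.7749, -122.4194, "assembled at the factory"),
    SpimeReport(chair, 40.7128, -74.0060, "delivered to its owner"),
]
for report in history:
    print(report.reported_at.isoformat(), report.note)
```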

Just as we’re always already part of the Network, all the marks we make are part of a stream. We keep the stream private and make sections of it public when we choose to. It’s with Ward Cunningham’s idea of the Wiki that the document as a current public version begins to gain purchase. Google Docs extends the metaphor to the typical office application suite. As Microsoft moves into the Network with Live Mesh, it has some opportunities to create foundational pieces of the new metaphor.

You could not step twice into the same rivers; for other waters are ever flowing on to you.
                          – Heraclitus

To understand the state of writing to the stream, all we need to do is look at what FriendFeed aggregates. To understand what the most common writing implements are, we can examine what makes up the flow that passes through FriendFeed. No doubt we’d find the usual suspects, Blogs via RSS, Twitter, Flickr, Delicious, YouTube, etc. Upcoming is the tool that writes events to the stream. Where, you may ask, is Microsoft’s Office in all this? While Outlook can export an iCal file, it is unable to publish it to a stream. It’s as though the program is unaware that it’s part of a Network and meant to serve humans who live their lives in a stream of space and time. The writing implements and storage metaphors of the new internet operating system must take the stream of time into the foundation of their UI metaphor. Once our tools understand and inhabit their proper ecosystem, Jon Udell’s local school will stop posting calendars as PDFs.
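
The missing step is small. Once a calendar exists as an .ics file, publishing it to the stream is just a matter of putting it behind a stable URL that clients can subscribe to. The sketch below serves the hypothetical school-calendar.ics from the earlier example; the port and paths are arbitrary:

```python
# The missing step, sketched: put an exported .ics file behind a stable URL
# so calendar clients can subscribe to it. The file name and port are
# arbitrary; this is an illustration, not a deployment recipe.

from functools import partial
from http.server import HTTPServer, SimpleHTTPRequestHandler

# Serve the directory containing the hypothetical school-calendar.ics
handler = partial(SimpleHTTPRequestHandler, directory=".")
server = HTTPServer(("", 8000), handler)

print("Subscribe to http://localhost:8000/school-calendar.ics")
server.serve_forever()
```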

Of course, there is a psychological hurdle when it comes to incorporating time into our new tool set. It reminds us that we are mortals, and our time is not unlimited.
