Category: markets

Twitter Scrabble

Now, the Scrabble dictionary doesn’t seem to think that the word “OK” is legal. But some would argue for its acceptance. Gavin Rude, an English teacher, makes a gallant attempt in this PDF. However, given the current political climate around online Scrabble, Kevin Marks and Brian Oberkirch could be viewed as colluding to play team Scrabble through Twitter without permission from the Hasbro Corporation.

I’m expecting takedown notices all around.

The Razor and the Blade: Kumbaya Economics

There are a number of narratives located in the words “open source.” The dominant narrative is the story of software development and maintenance through tightly coordinated iterations, with inputs from a potentially unbounded number of interested parties. The economics of open source require diversifying the carriers of value away from the traditional modes. I’ve purposefully begun this exploration with economics rather than the concept of free access to source code.

It’s the idea of “free” that has expanded to connect up with other “free narratives” to create confusion. It’s a kind of utopian vision: free beer, free speech, free love, free software. A binary opposition is generated that pits free + generosity against price + greed. The moral elements of the equation rise to the surface when comparing alternative software solutions. There’s a utopian narrative that has attached itself to open source software and simultaneously detached itself from any rational economics. It’s a story of free beer rather than free speech, and is utopian in its original meaning of “no place.”

Safety Razor

Chris Anderson has focused the conversation with his forthcoming book, “Free.” The emerging economic model he describes is woven from value transactions across multiple delivery and product modes, some free, others at a cost. This blend results in a sustainable economic system. It’s the combined value of the whole set that matters, not the percentage of free delivery modes versus pay delivery modes. And as we move further into the attention-gesture economy, the methods of payment will become more diversified as well. One-hundred-percent free in all modes, for all time, is simply a method of incurring debt. At some point the system has to come back into balance, either through the addition of a revenue component or through bankruptcy. Hobbyist and enthusiast systems work through the attention-gesture economy, but so do services like The Google.
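A toy calculation makes the point that solvency depends on the blended total, not on how many of the modes are free. Every number below is invented for illustration; this is a minimal sketch, not anyone’s actual business model:

```typescript
// Toy model of a "free + paid" blend -- every number here is made up.
// The question is whether combined revenue across all modes covers
// combined cost, not what fraction of the modes are free.

interface Mode {
  name: string;
  users: number;
  revenuePerUser: number; // attention/ad revenue counts, not just price
  costPerUser: number;    // hosting, support, bandwidth
}

const modes: Mode[] = [
  { name: "free tier", users: 900_000, revenuePerUser: 0.4, costPerUser: 0.25 }, // ads only
  { name: "paid tier", users: 100_000, revenuePerUser: 5.0, costPerUser: 1.5 },
];

const net = modes.reduce(
  (sum, m) => sum + m.users * (m.revenuePerUser - m.costPerUser),
  0
);

// A 90%-free system can still be solvent; a 100%-free system with zero
// revenuePerUser in every mode can only accumulate debt.
console.log(net >= 0 ? `sustainable (net ${net})` : `incurring debt (net ${net})`);
```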

There are thousands of open source projects, but the ones that combine well with commercial projects are the most active and well supported. The number of active projects is actually quite small. Entrepreneurs are constantly searching for new combinations to produce excess value at viable margins. As products become more modular, value migrates to design. Apple’s operating system combines open source infrastructure with a highly-customized human interface. The combination creates superior value.

There’s a temptation to believe that all the players in a commercial market should contribute openly to the commons: that we should all come together and sing kumbaya. The fact that every new digital product will contain some form of open source module doesn’t change the competitive landscape. Companies may sing kumbaya, but they still wield the razor and the blade, and that’s as it should be.

Boundaries of the Real-Time Conversational Flow through the Network

The time tunnel

There are many kinds of “time.” Sometimes we use the adjective real to describe time. As we talk about the Live Web and begin to imagine the possibilities of XMPP, a new set of online experiences comes into focus. Real-time computing has come to mean very short system response times. But how short is short? Where are the borders of the real-time experience? What are the human factors?

Jakob Nielsen is as good a place to start as any. In his book Usability Engineering, he discusses Robert B. Miller’s classic paper on Response Time in Man-Computer Conversational Transactions. Miller talks about three threshold levels of human attention.

  • 0.1 second is about the limit for having the user feel that the system is reacting instantaneously, meaning that no special feedback is necessary except to display the result.
  • 1.0 second is about the limit for the user’s flow of thought to stay uninterrupted, even though the user will notice the delay. Normally, no special feedback is necessary during delays of more than 0.1 but less than 1.0 second, but the user does lose the feeling of operating directly on the data.
  • 10 seconds is about the limit for keeping the user’s attention focused on the dialogue. For longer delays, users will want to perform other tasks while waiting for the computer to finish, so they should be given feedback indicating when the computer expects to be done. Feedback during the delay is especially important if the response time is likely to be highly variable, since users will then not know what to expect.

The other rule of thumb is Akscyn’s Law:

  • Hypertext systems should take about 1/4 second to move from one place to another. 
  • If the delay is longer people may be distracted. 
  • If the delay is much longer, people will stop using the system. 
  • If the delay is much shorter, people may not realize that the display has changed. 
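Taken together, the Nielsen/Miller limits and Akscyn’s quarter second suggest a simple feedback policy for any client. Here’s a minimal sketch in TypeScript; the threshold values come from the sources above, but the policy wrapper and the FeedbackUI interface are my own illustration:

```typescript
// Thresholds from Miller/Nielsen and Akscyn; the wrapper is illustrative.
const INSTANT_MS = 100;      // <= 0.1 s feels instantaneous; just show the result
const FLOW_MS = 1_000;       // <= 1 s: the delay is noticed, but flow of thought survives
const ATTENTION_MS = 10_000; // > 10 s: attention is lost without progress feedback
const NAVIGATION_MS = 250;   // Akscyn: a hypertext jump should take about 1/4 second

interface FeedbackUI {
  showSpinner(): void;  // generic "working..." indicator
  showProgress(): void; // estimate of when the work will finish
  hideAll(): void;
}

async function withFeedback<T>(work: Promise<T>, ui: FeedbackUI): Promise<T> {
  // Under 1 second, show nothing; past 1 second, a spinner;
  // past 10 seconds, real progress so the user knows when to come back.
  const spinner = setTimeout(() => ui.showSpinner(), FLOW_MS);
  const progress = setTimeout(() => ui.showProgress(), ATTENTION_MS);
  try {
    return await work;
  } finally {
    clearTimeout(spinner);
    clearTimeout(progress);
    ui.hideAll();
  }
}

// INSTANT_MS and NAVIGATION_MS are budgets to design to, not triggers to act on.
```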

This puts the range of real time interaction between 1/10 and 1/4 of a second. This gives us some sense of the boundaries for the flow of a real time conversation through the network. The maxim that “faster is better” is supported in the laboratory. Experimental research by Hoxmeier and DiCesare on user satisfaction and system response time for web-based applications reported findings on the following hypotheses:

  • Satisfaction decreases as response time increases: Supported
  • Dissatisfaction leads to discontinued use: Supported
  • Ease of use decreases as satisfaction decreases: Supported
  • Experienced users are more tolerant of slower response times: Not supported

But the war against latency in system response has gone well beyond tenths of a second, down to thousandths of a second. The front lines of that battle are on Wall Street, or in New Jersey to be more specific. Richard Martin of InformationWeek reports on data latency and trading in Wall Street & Technology.

Firms are turning to electronic trading, in part because a 1-millisecond advantage in trading applications can be worth millions of dollars a year to a major brokerage firm. That is why colocation — in which firms move the systems running their algorithms as close to the exchanges as possible — is so popular.

Wall Street isn’t stopping at milliseconds: “Five years ago we were talking seconds, now we’re into the milliseconds,” says BATS’ Cummings. “Five years from now we’ll probably be measuring latency in microseconds.”
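For a sense of what measuring at that resolution looks like, here is a minimal Node.js sketch; the echo host is hypothetical, and the real point is that once microseconds matter you need a monotonic, high-resolution clock rather than Date.now():

```typescript
// Round-trip latency at sub-millisecond resolution (Node.js).
import { connect } from "node:net";

const socket = connect({ host: "echo.example.com", port: 7 }, () => {
  const sent = process.hrtime.bigint(); // nanosecond-resolution monotonic clock
  socket.write("ping");
  socket.once("data", () => {
    const rttNs = process.hrtime.bigint() - sent;
    console.log(`round trip: ${Number(rttNs) / 1_000} microseconds`);
    socket.end();
  });
});
```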

If services like Twitter are going to scale up to become primary gesture/attention markets, they’ll need to extend their real-time flow to partners via an API. If they’re going to get that right, they’ll need to focus on delivering high-volume, high-quality data liquidity. The key question is under what terms that data will be available. The economics of real-time stock exchange data are well established. Information asymmetry models assume that at least one party to a transaction has relevant information while the other(s) do not. Relevant information is a tradable advantage. Initially we just need enough speed to keep the conversation flow alive. But a live conversation is only the beginning of the creation of tangible value. The architecture of Wall Street’s trading systems provides us with a view into our future need for speed.
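Mechanically, extending that flow to partners could be as simple as one persistent connection per consumer, with messages pushed as they happen. A sketch of the shape, with a hypothetical endpoint and message format (not Twitter’s actual API):

```typescript
// Consume a hypothetical real-time gesture feed: one long-lived HTTP
// connection delivering newline-delimited JSON, handled as it arrives.
import { request } from "node:https";

function handleGesture(gesture: { user: string; text: string; at: string }): void {
  // Every millisecond a message sits in a queue, its value as a
  // conversational (or market) signal decays.
  console.log(`${gesture.at} ${gesture.user}: ${gesture.text}`);
}

const req = request("https://stream.example.com/gestures", (res) => {
  let buffer = "";
  res.setEncoding("utf8");
  res.on("data", (chunk: string) => {
    buffer += chunk;
    const lines = buffer.split("\n");
    buffer = lines.pop() ?? ""; // keep any partial trailing line
    for (const line of lines) {
      if (line.trim()) handleGesture(JSON.parse(line));
    }
  });
});
req.end();
```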

TS Eliot

Real time is important only as it relates to future time. Real-time data is the input to the Bayesian calculation of the probability of future outcomes (a minimal sketch follows the poem below). Predicting the future correctly is big business. To understand the meaning of the flow of time, perhaps it’s best to start with T.S. Eliot.

Time present and time past
Are both perhaps present in time future
And time future contained in time past.
If all time is eternally present
All time is unredeemable.
What might have been is an abstraction
Remaining a perpetual possibility
Only in a world of speculation.
What might have been and what has been
Point to one end, which is always present.

Burnt Norton by T.S. Eliot
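Returning from the poem to the math for a moment, the Bayesian update mentioned above works like this. The numbers are invented; only the rule P(H|E) = P(E|H) × P(H) / P(E) is real:

```typescript
// Bayes' rule: posterior = likelihood * prior / evidence.
function bayesUpdate(prior: number, likelihood: number, evidenceRate: number): number {
  return (likelihood * prior) / evidenceRate;
}

// Hypothesis H: "this stock moves up in the next minute." Say the prior is
// 0.5, and a burst of buy orders (the real-time evidence E) precedes an
// up-move 70% of the time but appears only 40% of the time overall.
const posterior = bayesUpdate(0.5, 0.7, 0.4);
console.log(posterior.toFixed(3)); // 0.875 -- time present revising time future
```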

Microsoft and Google: Wielding Hard and Soft Power

Vendor Lock In

Steve Lohr of the NY Times posted an interesting article on the economics of Google and Microsoft. As usual, the Network Effect was front and center in the analysis. Bill Gates gets his props as the foremost applied economist of the 20th century. For those keeping score at home, that would be the last century. According to Lohr, Google lays claim to the 21st century. But it’s Lohr’s extension of the metaphors of hard and soft power that opens some new areas for conversation.

Microsoft is associated with hard power combined with the network effect. The idea is that through proprietary formats and an operating system, Microsoft created a lock-in that couldn’t be broken. You can check out any time, but you can never leave. Interestingly, Microsoft’s network effect was created without the Network. Dominance was enforced at the enterprise and OEM level; most users never actually had to buy a Microsoft product themselves.

Google is associated with soft power. Users are free to leave at any time, and no proprietary formats are used, but ongoing usage creates a form of addiction. The network effect enables the large-scale harvesting of user gestures to create a learning machine that constantly adapts its algorithms. The result is the ongoing, incremental improvement of the value of its software products delivered through the Network. Switching costs are low, but finding better value is difficult.
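As a caricature of that learning machine (my sketch, not Google’s actual machinery), picture every click as free training data that nudges the ranking:

```typescript
// A toy gesture-harvesting loop: record impressions and clicks,
// score results by smoothed click-through rate, rank by score.
const stats = new Map<string, { shown: number; clicked: number }>();

function recordGesture(resultId: string, wasClicked: boolean): void {
  const s = stats.get(resultId) ?? { shown: 0, clicked: 0 };
  s.shown += 1;
  if (wasClicked) s.clicked += 1;
  stats.set(resultId, s);
}

function score(resultId: string): number {
  const s = stats.get(resultId) ?? { shown: 0, clicked: 0 };
  // Laplace smoothing keeps brand-new results from being buried at zero.
  return (s.clicked + 1) / (s.shown + 2);
}

function rank(resultIds: string[]): string[] {
  return [...resultIds].sort((a, b) => score(b) - score(a));
}
```

The more gestures flow in, the better the ranking gets, and nothing about the format locks the user in; the advantage lives entirely in the accumulated data.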

The internet has detached the user experience from Microsoft’s hard power, and Google has created a cash machine located firmly within the Network. Microsoft won the 20th century’s battle for hard power, but the 21st century’s battle is over soft power. The major players have to dominate without lock-in, and Microsoft is starting to pivot from hard power to soft power. The Yahoo play was part of that strategy; Live Mesh and Silverlight also move Microsoft up the stack to the level of the Network. To win in the soft power arena, you’ve got to play in the open and you’ve got to deliver more value. The other thing Microsoft needs is a source of user gestures and an engine for harvesting them as inputs to improving the value of the product.

The hard power metaphor is useful for looking at the lock-in players that still have some dominance. The obvious move would be to look at the entertainment industry, but that game is largely over. It’s the Telcos that really still play hardball with hard power. The iPhone is starting to break that lock as it floats above the telephony system and lets the Network dominate. Think about the raw usage percentages on the iPhone: how much telephony, how much Network? The big lie that the Telcos need you to believe is that voice data is special. They need to distract you from the fact that the Network is getting more and more real time and delivers multiple media types at a lower cost.

But the Telcos are safe until the internet identity problem is solved. Today you’re identified by a phone number. Tomorrow it may be OpenID or CardSpace, but you won’t need that phone number anymore. When the hard power war is over in that space, a huge wave of innovation will be unleashed. And you might be surprised about who’s leading that charge…
