
Category: interaction design

Surfing The Waves of Technology

The company vilified by some as being too closed to be successful in the long run has, in the long run, defined and distributed the dominant model for human-computer interaction. The reality is that all products brandishing the so-called open systems label are operating within the parameters set by Apple.

And while it’s certainly true that Apple didn’t create any of these interaction modes out of whole cloth, they codified them, then shipped and sold the products that have turned them into de facto standards.

A de facto standard is a custom, convention, product, or system that has achieved a dominant position by public acceptance or market forces (such as early entrance to the market). De facto is a Latin phrase meaning “concerning the fact” or “in practice”.

In the beautiful silence emanating from Apple prior to the January 27, 2010 announcements, a curious thing has happened. The full attention of the technical intelligentsia has been focused on what’s missing from our personal and social computing experience. The announcements will be an interesting test of the ‘wisdom of the crowd.’ Theoretically, the predictions and analysis of the thousands of individuals writing about what will be announced could be distilled into either exactly the device Apple intends to release, or a blueprint for an even better device. My bet is that we will be surprised.

Of course, we can point to Xerox PARC, or Doug Engelbart, and say none of these things are new. But moving ideas from the lab to the street is a matter of knowing which dots to connect. In an interview, Jobs talks about recognizing the valuable waves of technology:

“Things happen fairly slowly, you know. They do. These waves of technology, you can see them way before they happen, and you just have to choose wisely which ones you’re going to surf. If you choose unwisely, then you can waste a lot of energy, but if you choose wisely it actually unfolds fairly slowly. It takes years.”

In order to connect dots, you need to be in a position to do so. Sometimes we tend to overlook the core skill set that Apple has amassed. Here’s Jobs talking about what Apple does:

“Well, Apple has a core set of talents, and those talents are: We do, I think, very good hardware design; we do very good industrial design; and we write very good system and application software. And we’re really good at packaging that all together into a product. We’re the only people left in the computer industry that do that. And we’re really the only people in the consumer-electronics industry that go deep in software in consumer products. So those talents can be used to make personal computers, and they can also be used to make things like iPods. And we’re doing both, and we’ll find out what the future holds.”

So, while we live in an era of “organizing without organizations,” can we expect distributed organizations harnessing the crowd to produce, sell, and ship products at the same level as Apple? Crowds have a difficult time indicating what should be left out, and this is a key to superior industrial design. Here’s Jobs on Apple’s design process:

“Look at the design of a lot of consumer products—they’re really complicated surfaces. We tried to make something much more holistic and simple. When you first start off trying to solve a problem, the first solutions you come up with are very complex, and most people stop there. But if you keep going, and live with the problem and peel more layers of the onion off, you can oftentimes arrive at some very elegant and simple solutions. Most people just don’t put in the time or energy to get there. We believe that customers are smart, and want objects which are well thought through.”

In 2007, Apple changed its name from Apple Computer, Inc. to Apple Inc. In some sense, this signaled the end of the era of the personal computer. The computer has begun its migration and blending into other devices: some existing, others yet to be invented. Here’s Jobs on where the revolution is going:

“I know, it’s not fair. But I think the question is a very simple one, which is how much of the really revolutionary things people are going to do in the next five years are done on the PCs or how much of it is really focused on the post-PC devices. And there’s a real temptation to focus it on the post-PC devices because it’s a clean slate and because they’re more focused devices and because, you know, they don’t have the legacy of these zillions of apps that have to run in zillions of markets.”

While there have been tablet computers for quite a long time, they were primarily designed as an evolution of the personal computer. In thinking about Apple’s announcement, the previous frame of reference is wrong— just as it is for those who believe the iPhone is a telephone. In looking at what’s missing from our social computing environment, we think we know the set of dots that need to be connected. But if we sit with the problem long enough, a whole new set of dots will come into focus. Here’s Jobs on vision and design:

“There’s a phrase in Buddhism, ‘Beginner’s mind.’ It’s wonderful to have a beginner’s mind.”


With Just A Wave Of Her Hand…

My thoughts have been swirling around the point of interaction for some time now. And by that I mean the point of human-computer interaction. To connect up the threads, I’ve begun by looking backwards. Perhaps all the way to the Jacquard loom and the punch cards used to control its patterns, and then on to the punch cards used on the early mainframes.

I’m sure there were many steps in between, but my mind races ahead to the command line. This extremely powerful and elegant point of interaction has never really been superseded. It continues to be the favored mode of interaction for a number of software development activities. But it was the graphical user interface that provided a point of interaction that changed the medium.

Doug Engelbart’s 1968 demo of the work undertaken by the Augmentation Research Center (ARC) gives us all the fundamental modes of interaction: the keyboard, the mouse/trackpad, the headset, hypertext, and the graphical user interface. Within that set of interaction points, we’ve started to expand the repertoire. With the introduction of the iPhone, the trackpad gesture has gained increasing importance.

On a separate track we’ve seen video game controllers become ever more complex. The point of interaction for the game starts to reflect the kind of speed and complexity we create in our virtual gaming worlds.

It’s with the Wii and Project Natal that we start to see the surface of the trackpad detached from the computing device, extruded into three dimensions, and then dematerialized. The interaction gestures can now be captured in the space around us. Originally, the graphical user interface (mouse clicks, windows, desktop) was criticized for the limitations it imposed.

The other key development was the displacement of computing from the local device to the Network of connected devices. The point of interaction now opens onto a new Networked medium. This is the converged form of what McLuhan understood as television. The development of new interaction modes traces a path toward opening the new medium to greater numbers of participants. Beyond mass media, there is the media of connected micro-communities.

Popular culture and music culture have always had a big impact on the development of cutting-edge technology. When we think of controlling technology through hand gestures, we can start with the ether-wave theremin created by Leon Theremin.

And then there was Jimi Hendrix playing Wild Thing at Monterey Pop, gesturing wildly to pull sound out of his Stratocaster.

This is one of those in-between moments. The wave unleashed by Xerox PARC and the Augmentation Research Center is about to be followed by a new wave. The signs are all around us.


Nexus One, iPhone and Designing For Sustainability

The technology news streams have been filled with coverage of the new Google phone called the Nexus One. Its impact will be significant. Now there are two “phones” in the new landscape of mobile computing. Two are required to accelerate both innovation and diffusion of the technology. The Nexus One will both spur, and be spurred on by, the iPhone.

Much of the coverage has focused on comparisons of the two devices with regard to feature set and approach to the carriers. On the product strategy side, the story of the early Macintosh vs. Windows battle is being replayed by the pundits with Google cast in the role of Microsoft, and Android as the new Windows. The conventional wisdom is that Apple lost to Microsoft in the battle of operating systems, and that history will repeat itself with the iPhone.

A quick look at the top five U.S. companies by market capitalization shows Microsoft, Google, and Apple holding down three of those spots. Apple’s so-called losing strategy has resulted in a market cap of $190 billion and a strong, vibrant business. If history repeating itself leads to this kind of financial performance, I’m sure Apple would find that more than acceptable.

But it was watching Gary Hustwit’s film Objectified that brought forward a comparison that I haven’t seen in all the crosstalk. Following up his film, Helvetica, which documented the history of the typeface, Hustwit takes a look at the world of industrial design and designers:

Objectified is a feature-length documentary about our complex relationship with manufactured objects and, by extension, the people who design them. It’s a look at the creativity at work behind everything from toothbrushes to tech gadgets. It’s about the designers who re-examine, re-evaluate and re-invent our manufactured environment on a daily basis. It’s about personal expression, identity, consumerism, and sustainability.

Industrial design used to be about designing the look and feel of a product; the designer was brought in to make it pretty and usable. Now the whole lifecycle of the product is considered in the design process. I’ve found John Thackara’s book In the Bubble and Bruce Sterling’s Shaping Things to be very eloquent on the subject. Looking beyond how the phone works for the user, there’s the environmental impact of the industrial manufacturing process and of disposing of the phone at the end of its life.

It was Craig Burton’s Choix Vert Action Card that brought Apple’s policies on industrial design and the environment into view for me. When you search Google for something, the Choix Vert card adds a thumbprint logo to socially responsible companies on the results page. Apple sports the Choix Vert mark; HTC, producer of the Nexus One, doesn’t. Currently, Apple provides environmental impact reports for each of its products. Apple’s so-called ‘closed’ approach to its products results in a unique ability to control not only the user experience, but how the product is manufactured and what happens at the end of its life.

Google’s modular approach to their phone means they can claim they aren’t responsible for manufacturing or disposal. The Android run-time will be put on a variety of phones manufactured by companies with varying degrees of social responsibility.

Early reports from users indicate that the Nexus One’s user interface could use a little more polish. I expect that will happen as the software is iterated and the user experience refined. But beyond feature sets and carrier costs, I hope Nexus One users will ask Google about the environmental impact of their phone.

Every year, about 130 million cell phones are retired; for every Nexus One that’s purchased, it’s likely that another cell phone will go out of service. Google is now in the consumer hardware business, and that brings with it some responsibilities they aren’t used to considering. Given their corporate motto, I’m sure they’ll do the right thing.

http://en.wikipedia.org/wiki/Product_lifecycle_management

Sensing the Network: The Sound of the Virtual

Over lunch with Steve Gillmor the other day, the topic strayed to the dubbing of foreign films. It linked up to an earlier conversation with Aron Michalski about the digital editing of recordings of live music. Our live experience goes virtual as it moves into the past; sound and vision are no longer linked. They become arbitrarily coordinated streams of media. The soundtrack of a film can be completely replaced, and the language spoken by the actors can be localized to particular audiences. Wrong notes or timing in a live music performance can be fixed in post-production before a quick release to the Network. The period of latency between the live moment and its distribution through a channel provides the opportunity to match our desires with the physical artifact of production. We get a second bite at the apple.

The other instance where separate streams of sound and video are synchronized to create the appearance of a natural experience is when we have the expectation of sound. This is a common practice in science fiction films set in space. Floating through space, we hear the roar of the engines, the blast of the weapons, and the explosion of the enemy ships. Of course, space is a vacuum and sound vibrations can’t occur without a suitable medium. We dub in the sound that makes emotional sense— desire and experience are synchronized.

The mechanical vibrations that can be interpreted as sound are able to travel through all forms of matter: gases, liquids, solids, and plasmas. The matter that supports the sound is called the medium. Sound cannot travel through vacuum.

While we may consider outer space to be the final frontier, there’s another frontier that has opened in front of us that is being explored every day by ordinary people. The virtual space of the Network is all around us. When we type messages on our iPhones, we hear the sound of clicking keys; when we take digital photos we hear the sound of the shutter clicking; when we drive certain kinds of electric cars, we hear the sound of a gasoline engine.

The haptics of the virtual replicate the physics of the physical world. Events in the virtual space of code trigger a sound stream that has an experiential analogy in the physical world. We’ve virtualized complex mechanical interfaces with knobs, dials, sliders, and various data readouts. The dashboard is the holy grail of business intelligence. Some have even proposed a real-time dashboard as the new center of our computing experience.
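As a rough illustration of the idea (my own sketch, not anything from the products discussed), here is what this dubbing of the virtual tends to look like in code: a browser TypeScript fragment in which events that have no inherent sound are matched, after the fact, with samples borrowed from the physical world. The event names and sound file paths are placeholders.

```typescript
// A minimal sketch of skeuomorphic audio feedback in a browser.
// The sample files are placeholders; any short recordings would do.
const eventSounds = {
  keypress: "sounds/key-click.wav",      // typing on glass, dubbed with a mechanical key click
  shutter: "sounds/shutter-click.wav",   // a digital capture, dubbed with a film camera's shutter
  incomingCall: "sounds/metal-bell.wav", // a network event, dubbed with an old telephone bell
} as const;

// The event itself is silent; the sound is an analogy we synchronize to it.
function dub(event: keyof typeof eventSounds): void {
  void new Audio(eventSounds[event]).play();
}

// Wire a physical-world sound to a purely virtual event.
document.addEventListener("keydown", () => dub("keypress"));
```

Nothing in the event requires the sample it is paired with; the pairing is a choice made to keep desire and experience in sync.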

Consider for a moment how we’ve begun to dub our virtual space to synchronize it with the physical space of our environment. My iPhone uses a traditional telephone ringing sound to signal when a call is coming through. I selected this sound from a menu of possible sounds. Actual telephones that contain metal bells that ring on an incoming call event are pretty rare these days. Many younger people have only experienced the virtual sound of the old telephone.

The link between sound and vision is arbitrary in the virtual world. Our cheap digital camera can sport a sound sample taken from the most expensive mechanical camera. What’s the sound of code executing? We extend the context from our mechanical physical universe into the virtual universe to give us a sense of which way is up, when something has started and when it’s finished. The soundtrack to the virtual is a matter of cultural practice, but it’s both variable and personalizable. However, as the mechanical recedes around us, our context also becomes fainter. Will the virtual always be a mirror world, or will some new practice emerge from the Network itself? Can a concept of natural sound be generated from a world where sound doesn’t naturally occur, but is rather always a matter of will?
