

Going Orbital: Content and its Discontents

The image is an arresting one: content orbiting around a new center of gravity. Cameron Koczon, in his essay Orbital Content, addresses what he sees as a new breed of bookmarklet: Instapaper, Svpply and Readability. These applications allow a user to extract content from its original location, and then copy and transmit it to a personal archive. With Instapaper and Readability, this may involve using a bookmarklet in a desktop web browser to DVR a long-form article from a publisher’s web site for later consumption on an iPad. With Svpply, it may mean extracting a favorite item from an online e-commerce site to create a cross-domain curated scrapbook stream that’s shared in a social media context. Content exists as free satellites, plucked from their originating orbits and placed into a personal orbit.

The bookmarklet application occupies an interesting space. It sits at the edge of a desktop application—the browser—and operates on “pages” requested by the browser from servers on the Network. Rather than simply creating a shortcut to a specific URL, the bookmarklet is JavaScript code that runs on the current page. Because it holds this outside position, it can mix together code from multiple domains, blending the intentions of the page publisher, data or snippets from other sources, and the intentions of the user. In the case of “orbital content,” the bookmarklet is the transport mechanism: the user pushes a button and specified content is instantly transported from the orbit of one sun to another. Koczon’s enthusiasm about these new bookmarklet-based services raises the question of whether browserOS apps can gain awareness in the popular imagination.
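As a sketch of the transport mechanism, here’s roughly what a minimal “read later” bookmarklet might look like, written as TypeScript that would be compiled and wrapped in a javascript: URL. The save endpoint and its parameters are invented for illustration; each real service defines its own.

```typescript
// A minimal sketch of an Instapaper-style "save for later" bookmarklet.
// The endpoint (example.com/save) and its parameters are hypothetical.
function saveForLater(): void {
  // The bookmarklet runs inside whatever page is currently open, so it
  // can read that page's URL, title, and DOM directly.
  const url = encodeURIComponent(window.location.href);
  const title = encodeURIComponent(document.title);
  // Hand the pointer to the archiving service, which fetches and
  // extracts the article text on its own servers.
  window.open(`https://example.com/save?url=${url}&title=${title}`, "_blank");
}
saveForLater();
```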

While the idea of a new orbit is quite exciting, we still seem to be stuck with the word “content.” The very word “content” has engendered a certain amount of discontent. It turns the fire of the written word into a bland abstraction. It’s the equivalent of a factory that turns out “widgets.” It’s the sausage that’s ground through content management systems. When speaking of this new orbital stuff, content seems exactly the wrong word. Content is the thing which is contained—the stuff inside of the boundary. In the case of orbital content, we are not content. The word “content” also refers to a state of happiness—being content. There’s a sense of acceptance of conditions or circumstances, of acquiescence. In neither sense of the word can this new orbital stuff be called content. We do not acquiesce to its circumstances—we break it out of its container and pull it into a new orbit.

We treat the digital as another kind of analog medium. Since it can simulate anything, we extend the analog by simulating it with the digital. In vain, we then attempt to impose the natural boundaries of analog economics onto the digital. By stamping these limitations onto the digital, we put it into analog clothes and ask everyone to behave accordingly. To mass produce significant quantities of “identical” analog objects requires an industrial-scale factory. An identical digital object is forged each time a request is made to a web server on the Network. The original bits aren’t transported from here to there, copies of the bits are distributed to anyone who asks—production is reproduction, presentation is representation.

This is Major Tom to Ground Control, I’m stepping through the door
And I’m floating in a most peculiar way
And the stars look very different today
Here am I floatin’ ’round my tin can far above the world
Planet Earth is blue and there’s nothing I can do

Though I’m past one hundred thousand miles, I’m feeling very still
And I think my spaceship knows which way to go

The digital can be passed by reference or by value over the Network. The first preserves the integrity of the data, always pointing back to a single source—pointers rather than copies of bits are passed around. An updated source doesn’t require an updated pointer. A source could change incrementally, or totally, and still employ the same pointers. The second method begins the process of entropy: time-stamped bits are copied and sent, but once out on the wire, their link to the source is broken. An updated source marks a difference with the distributed bits. When the source changes, the distributed copy inscribes a historical state of the source. With every incremental change to the source, the difference in the distributed copy grows.
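A minimal sketch of the two modes, with illustrative types: a reference is dereferenced fresh on every read, while a copy stays frozen at the moment it was made.

```typescript
// Two ways the digital travels: by reference (a pointer that always
// resolves to the current source) or by value (a time-stamped copy whose
// link to the source is severed once sent). Shapes are illustrative.
interface ByReference {
  kind: "reference";
  url: string; // dereferenced at read time; always reflects the source
}

interface ByValue {
  kind: "value";
  copiedAt: Date; // the copy inscribes a historical state of the source
  bytes: string;  // frozen content; the source may drift away from it
}

async function read(item: ByReference | ByValue): Promise<string> {
  if (item.kind === "reference") {
    const response = await fetch(item.url); // fresh bits on every request
    return response.text();
  }
  return item.bytes; // entropy: whatever the source is now, this stays put
}
```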

The kind of copying done in the practice of orbital content is digital rather than analog. Traditionally, the value of a copy has been in its completeness and exactitude. A factory that turns out widgets with too much variance is said to have a quality assurance problem. When Instapaper copies, it copies the pointer exactly, but only a portion of the web page itself. The digital can make wholes out of any parts—there’s a legible boundary between every one and zero. It can be compared to harvesting a food crop: the ears of corn are gathered and the stalk and roots are discarded. To a publisher, this is a description of stripping the business model from the editorial.

The publisher asks the digital to behave as previous media always have—ink, once it is imprinted on paper, has a permanent presentation. Television programs are broadcast and the screen passively plays them. We can take a pen and draw a mustache on a photo in the newspaper, or mute the sound of our favorite television show while the commercial plays, but there’s a higher bar to clipping out segments and reusing them for our own purposes. Analog forms of automating the process have proved too costly and cumbersome.

Even as we thrill to and lament the rush of traditional media toward the digital, we still tend to view it through analog glasses. The web page as delivered from the server to the browser is meant to define an endpoint. The code is delivered and ready for presentation. The static page is given a sense of flow and time with AJAX-based page updates from backstage, altering the presentation in memory through manipulations of the document object model (DOM). The practice of orbital content takes the page not as an endpoint, but as an input to a process. Shedding their analog clothes, the digital bits making up the page show themselves not as an ending, but as a potential beginning. Using Instapaper, I pipe a designated section of the page, the story I’d like to read later, to my reading room, where it’s poured into the format I prefer for reading electronic documents. I define a new endpoint, but it could also be a potential starting point as some portion is shared in another context.
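Treating the page as input might look something like the toy extraction below. The single-selector heuristic is a stand-in of my own; Readability’s real algorithm scores candidate nodes by signals like text density and link density.

```typescript
// A toy sketch of taking the delivered page as input rather than endpoint:
// keep the story, discard the chrome. The selector list is illustrative.
function extractStory(doc: Document): string {
  const candidates = ["article", "main", "#content", ".post"];
  for (const selector of candidates) {
    const node = doc.querySelector(selector);
    if (node?.textContent && node.textContent.length > 500) {
      return node.textContent; // the ears of corn, minus stalk and roots
    }
  }
  return doc.body.textContent ?? ""; // fall back to the whole field
}
```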

In general, the browser application space (bookmarklets) has made significant strides: it’s gained a cross-platform software infrastructure with Phil Windley’s event-driven scripting language, KRL (Kynetx). And Apple has taken Readability and Instapaper seriously enough to incorporate similar functionality into the forthcoming browser operating system in Lion. This follows the historical pattern of fundamental features being absorbed into the infrastructure of the host platform. The larger picture is that “web pages” are now both machine readable and scriptable for individuals, something known to the spiders at Google for a long time. No need to wait for the so-called semantic web; the hooks are already there.

David Gelernter defined an alternative to the desktop metaphor called LifeStreams. Instead of named files in folders, inside of folders, inside of desk drawers—nothing needs to be named; things just appear in context in a time-stamped stream. Streams can be filtered by different contexts, organized in time rather than space. Future events put into the stream eventually pop up as something occurring today. With Facebook’s newsfeed, Twitter’s stream, and the various photo and location services, we’ve become accustomed to dealing with ranked lists and time-ordered streams. Even the output of these new orbital content services generally takes the form of a stream. In other words, orbital content isn’t really orbital either.
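A minimal sketch of the LifeStream idea, with invented types and field names: items enter a single time-ordered stream, substreams are just context filters, and future items surface when their moment arrives.

```typescript
// Gelernter's LifeStream, sketched: no names, no folders, just a
// time-stamped stream recalled through context filters. Illustrative only.
interface StreamItem {
  timestamp: Date;
  contexts: string[]; // tags like "work" or "travel", not folder paths
  payload: string;
}

const stream: StreamItem[] = [];

function add(item: StreamItem): void {
  stream.push(item);
  stream.sort((a, b) => a.timestamp.getTime() - b.timestamp.getTime());
}

// A substream is the stream seen through a context filter.
function substream(context: string): StreamItem[] {
  return stream.filter((item) => item.contexts.includes(context));
}

// Future events sit in the stream until they pop up as "today".
function today(now: Date = new Date()): StreamItem[] {
  return stream.filter(
    (item) => item.timestamp.toDateString() === now.toDateString()
  );
}
```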

But the metaphor is enchanting enough to do the thought experiment, to take the stream and bend it into a circular shape, an orbit. Timelines are one way of expressing time, but we also have a long history with circular time. We live through hours, days, weeks, months, seasons and years. These things we DVR for later might actually take the shape of satellites circulating in a personal orbit. Sort of like editing and layering loops, but using more than digital samples of music. What goes around, comes around—imagine orbital content as orbiting content.


Contra Optimization: 4th Time Around

The whole train of thought started in the most unlikely spot. It’s a bit of a random walk, an attempt at moving in circles to get closer to a destination. I was listening to a podcast called ‘Sound Opinions’ and Al Kooper was talking about the sessions in Nashville for Bob Dylan’s ‘Blonde on Blonde.’ They didn’t have a tape recorder, so Dylan would teach Kooper the changes and then Kooper would play them over and over again on a piano in Dylan’s hotel room. Dylan worked on the lyrics, Kooper played the changes and gradually, over many hours, the songs took shape.

Kristofferson described the scene: “I saw Dylan sitting out in the studio at the piano, writing all night long by himself. Dark glasses on,” and Bob Johnston recalled to the journalist Louis Black that Dylan did not even get up to go to the bathroom despite consuming so many Cokes, chocolate bars, and other sweets that Johnston began to think the artist was a junkie: “But he wasn’t; he wasn’t hooked on anything but time and space.”

Thinking about that process, I wondered if it would actually have been made better, more efficient, through the use of a tape recorder. Would the same or better songs have emerged from a process where a tape recorder mechanically reproduced the chord sequence as Dylan worked on the lyrics? Presumably, Kooper didn’t play like a robot, creating an identical sonic experience each time through. While Dylan and Kooper’s repetitive process eventually homed in on the song—narrowing the sonic field to things that seem to work—the resonances of the journey appear to be resident in the grooves. From this observation a question emerged: what is learned from a repetition that isn’t a mechanical reproduction, but rather a kind of performance? This kind of repetition seems to have the shape of an inward spiral.

We rush toward optimization and efficiency; these are the activities that increase the yield of value from our commerce engines. The optimal, by definition, means the best. Recently Nassim Taleb exposed the other side of optimization. When there’s a projected relative stability in an environment, as well as stable inputs and outputs for a system, optimization results in a higher, more efficient production of value. In times of instability, change and uncertainty, optimization produces a brittle infrastructure that must use any excess value it generates to prop itself up in the face of unanticipated change. Unless there’s a reversion to the previous stable state, the system eventually suffers a catastrophic failure. Robustness in uncertain times has to be built from flexibility, agility and a managed portfolio of options. Any strategic analysis might first take note of whether one is living in interesting times or not.

Some paths of thought can’t be fully explored by using optimization techniques. We tend to run quickly toward what Tim Morton calls the “top object” or the “bottom object.” The top object is the most general systematic concept from whence comes everything (“anything you can do, I can do meta”). To create this kind of schema you need to find a place to stand that allows you to draw a circle around everything—except, of course, the spot on which you’re standing. The bottom object is the tiny fundamental bit of stuff—Democritus’s atom—from which all things are constructed. Physics, though, seems to be having a tough time getting to the bottom of the bottom object—it keeps finding false bottoms, non-local bottoms, anti-bottoms and all kinds of weird goings on. The idea that there may be ‘turtles all the way down’ no longer seems far-fetched.

Moving in the opposite direction from a solid top or bottom, we run into Graham Harman’s presentation of Bruno Latour’s concept of irreducibility. Here’s Latour on the germ of the idea:

“I knew nothing, then, of what I am writing now but simply repeated to myself: ‘Nothing can be reduced to anything else, nothing can be deduced from anything else, everything may be allied to everything else.’ This was like an exorcism that defeated demons one by one. It was a wintry sky, and a very blue one. I no longer needed to prop it up with a cosmology, put it in a picture, render it in writing, measure it in a meteorological article, or place it on a Titan to prevent it falling on my head […]. It and me, them and us, we mutually defined ourselves. And for the first time in my life I saw things unreduced and set free.”

In his book, Prince of Networks, Harman expands on Latour’s idea. No top object, no bottom object, just an encompassing field of objects that form a series of alliances:

“An entire philosophy is foreshadowed in this anecdote. Every human and nonhuman object now stands by itself as a force to reckon with. No actor, however trivial, will be dismissed as mere noise in comparison with its essence, its context, its physical body, or its conditions of possibility. Everything will be absolutely concrete; all objects and all modes of dealing with objects will now be on the same footing. In Latour’s new and unreduced cosmos, philosophy and physics both come to grips with forces in the world, but so do generals, surgeons, nannies, writers, chefs, biologists, aeronautical engineers, and seducers.”

The challenge of Latour’s and Harman’s thought is to think about objects without using the tool of reduction. It’s a strange sensation to think things through without automatically rising to the top, or sinking to the bottom.

Taking the principle in a slightly different direction, we arrive at Jeff Jonas’s real-time sensemaking systems and his view of merging and purging data versus an approach he calls entity resolution. Ask any IT worker about any corporate database and they’ll talk about how dirty the data is. It’s filled with errors, bad data and incompatibilities, and it seems they can never get the budget to properly clean things up (disambiguation). The batch-based merge and purge system attempts to create a single correct version of the truth in an effort to establish the highest authority. Here’s Jonas:

“Outlier attribute suppression versus context accumulating: As merge purge systems rely on data survivorship processing they drop outlying attributes, for example, the name Marek might sometimes appear as Mark due to data entry error. Merge purge systems would keep Marek and drop Mark. Entity resolution systems keep all values whether they compete or not, as such, these systems accumulate context. By keeping both Marek and Mark, the semantic reconciliation algorithms can benefit by recognizing that sometimes Marek is recorded as Mark.”

Collecting the errors, versions and incompatibilities establishes a rich context for the data. The data isn’t always bright and shiny, looking its clear and unambiguous best—it has more life to it than that. It’s sorta like when you hear someone called by the wrong name, but you know who’s being talked about anyway. Maybe you don’t offer a correction, but simply continue the conversation.
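A minimal sketch of the contrast, with invented record shapes: rather than letting one surviving value purge the others, the entity accumulates every variant it has ever been called, and later observations benefit from that context.

```typescript
// Entity resolution sketched against merge purge: keep all observed name
// variants as context instead of dropping the outliers. Shapes invented.
interface EntityRecord {
  id: string;
  names: Set<string>; // every variant ever seen: "Marek", "Mark", ...
}

const entities = new Map<string, EntityRecord>();

function observe(id: string, name: string): void {
  const record = entities.get(id) ?? { id, names: new Set<string>() };
  record.names.add(name); // retain the "error" rather than purging it
  entities.set(id, record);
}

// Accumulated context pays off later: a record that has seen both
// spellings can reconcile either one.
function matches(id: string, name: string): boolean {
  return entities.get(id)?.names.has(name) ?? false;
}

observe("c-42", "Marek");
observe("c-42", "Mark");              // data-entry error, kept as context
console.log(matches("c-42", "Mark")); // true: the variant still resolves
```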

And this brings us back to Al Kooper banging out the changes on a piano in a hotel room, while Dylan sits hunched over a typewriter, pounding out lyrics. Somehow out of this circling through the songs over and over again, the thin wild mercury sound of Blonde on Blonde eventually took hold in the studio and was captured on tape.

Plotting your route as the crow flies is one way to get to a destination. But I have to wonder if crows really do always fly as the crow flies.


Ironic Architecture: The Audience And Its Double

My eyes trace the curve of a jet black line as it snakes across the paper. There’s a point at which the line stops and my eyes keep going, tracing the trajectory of where the line might have gone. It’s within the bounds of that short distance that we travel into the future. It’s this tracing that doesn’t trace anything that is the subject of this meditation.

“and now I can go on,” is the phrase Wittgenstein used to describe a certain relationship to a series. Given “2, 4, 6, 8, 10,” I think I can see where things are going. “Even positive integers” is a possible answer, but no matter what number comes next, a logic can be found for it. If the number is 12, that’s one sort of logic; if it’s 22, that’s another. Based purely on the visible, the adjacent invisible can always be colored in with a reasonable pattern.
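A bit of schoolbook algebra (my illustration, not Wittgenstein’s) makes the point concrete: for the visible terms at positions n = 1 through 5, any sixth term a you care to choose has a polynomial rule that fits all the evidence.

```latex
% Any continuation a of the series 2, 4, 6, 8, 10 fits some rule:
p(n) = 2n + \frac{a - 12}{120}\,(n-1)(n-2)(n-3)(n-4)(n-5)
% The product term vanishes at n = 1,...,5, so p reproduces the visible
% series exactly, while p(6) = a. Choose a = 12 and the logic is "even
% positive integers"; choose a = 22 and a different, equally consistent
% logic appears.
```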

It turns out that perception works in a similar way. The gaps in our apprehension of the world are bridged, filled in, to create the sensation of the smooth flow of time and experience. We project ourselves into the future. And our memories make liberal use of sampling to construct a rational narrative to account for the dramatic beats of our lives occurring before this one.

While past is not necessarily prologue, if you have enough data on what ‘usually happens’ you can make an educated guess about what will happen next. Through a statistical analysis of big data, the trajectory of partial behavior can be made visible, and the completion of that behavior can be projected. Correlations in the data emerge to tell a story that is unavailable to any one individual. Here the life of the human becomes actuarial, a set of probabilities for the possibilities. Once the percentages of the probabilities have exhibited some durability, casino economics can be installed to manage the risk and profit from these tendencies. The owners and operators of big data systems have a private view into a higher-dimensional phase space. And despite what these organizations tell us about good and evil, they are purely commercial enterprises.
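The actuarial move can be sketched in a few lines, with invented data shapes: count what ‘usually happens’ after a given partial sequence of actions, and read the percentages off the counts.

```typescript
// A bare-bones sketch of projecting the completion of partial behavior
// from logged sequences. Data shapes are illustrative; real systems work
// at vastly larger scale with far richer models.
function nextStepOdds(
  logs: string[][],  // each entry: one recorded sequence of actions
  prefix: string[]   // the partial behavior observed so far
): Map<string, number> {
  const counts = new Map<string, number>();
  let total = 0;
  for (const sequence of logs) {
    const matchesPrefix = prefix.every((step, i) => sequence[i] === step);
    const next = sequence[prefix.length];
    if (matchesPrefix && next !== undefined) {
      counts.set(next, (counts.get(next) ?? 0) + 1);
      total++;
    }
  }
  // Counts become probabilities: the percentages the casino relies on.
  return new Map(
    [...counts].map(([step, n]) => [step, n / total] as [string, number])
  );
}
```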

A big data interlude: capturing big data on the Network used to be the province of spiders. In the search business, it was only through expedition, return and accumulation of pointers and metadata that a sufficient store of big data could be created. With Twitter and Facebook, big data is created second-by-second within the walls of a single location. It’s the users who do all the traveling, sending postcards and pointers back to the archive.

As the probabilities solidify, another landscape emerges—along with the building materials for another level of architecture. For instance, using the tendencies that behavioral finance has uncovered, Thaler and Sunstein suggest building architectures that frame choice in such a way that people are ‘nudged’ into getting with the program. The program might be putting a percentage of one’s salary into a 401k to fund retirement, or selecting a healthy lunch at the school cafeteria. We tend to accept the default and choose the item put in our path. Sunstein and Thaler call this activity ‘Choice Architecture’ because while an individual is free to make any choice, the selection set is tilted toward a particular policy agenda. This tilting toward a particular outcome is what they call “a nudge.”
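In interface terms, the tilt can be as small as ordering and a preselected default. A sketch, with invented names: every option remains available, but inaction becomes assent to the architect’s agenda.

```typescript
// Choice architecture sketched as code: all options selectable, but the
// favored one is listed first and checked by default. Names invented.
interface Choice {
  label: string;
  isDefault: boolean;
}

function presentChoices(options: string[], nudgeToward: string): Choice[] {
  return [...options]
    // place the favored option first, where attention lands
    .sort((a, b) => (a === nudgeToward ? -1 : b === nudgeToward ? 1 : 0))
    // and preselect it, so doing nothing means accepting it
    .map((label) => ({ label, isDefault: label === nudgeToward }));
}

presentChoices(["opt out", "enroll at 3%", "enroll at 6%"], "enroll at 3%");
// => "enroll at 3%" appears first and is checked; free choice stays intact
```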

I like to call it “Ironic Architecture,” because while any choice can theoretically be made, the character in this little story is unaware of the manipulation and tilting of the selection set. When the character accepts the nudge and acts as the statistical analysis suggests they might, another level of the story is being played out.

Here’s Fowler’s Modern English Usage on irony:

“Irony is a form of utterance that postulates a double audience, consisting of one party that hearing shall hear and shall not understand, and another party that, when more is meant than meets the ear, is aware both of that more and of the outsider’s incomprehension.”

While we make a big show of talking about how we want to engage the rational needs and desires of a user in the networked hypertext environment, more and more we’re seeing choice architecture employed to win without fighting, to persuade without engaging in a rational discussion.

This kind of strategy plays out in a number of domains; in politics, it’s called framing or, a little more obscurely, heresthetic:

“Like rhetoric, heresthetic depends on the use of language to manipulate people. But unlike rhetoric, it does not require persuasion. ‘With heresthetic,’ according to Riker, ‘conviction is at least secondary and often not involved at all. The point of an heresthetical act is to structure the situation so that the actor wins, regardless of whether or not the other participants are persuaded.’”

Personal behavior data is being created and recorded at an ever increasing rate. The phrase ‘information exhaust’ is an apt description of the continuous inscription of our activities into digital media. And while we may think that some superior form of personalization will be available to us based on this large data set, it’s more likely that big data will yield correlations and trends that are built into our environments and make us characters in stories of which we are unaware.

Harry Brignull has coined the phrase ‘dark patterns’ for this kind of architecture. Brignull writes eloquently about Alan Penn’s lecture on the architecture of Ikea and how consumer movement through that environment results in the unfolding of a singular story that its characters are unaware of:

“What Ikea have done is taken away something which is very fundamental, evolved into us, and they’ve designed an environment that operates quite differently, given that we are forward facing people, embodied […] from the way it would happen if you just looked down from outer space. Its effect is highly disorienting.”

“Ikea is highly disorienting and yet there is only one route to follow. […] Before long, you’ve got a trolley full of stuff that is not the things that you came there for. Something in the order of 60% of purchases at Ikea are not the things that people had on their shopping list when they came in the first place. That’s phenomenal.”

The best minds of our generation are designing dark patterns to entangle us in a story in which we spend more than we intend. They’re also designing choice architectures to get us to save for retirement, eat a healthy diet, get immunizations and show up for school. But the conversation and the narrative are happening at a level we don’t have access to—rhetoric without argument.


Ah Sxip, We Hardly Knew Ye…

It was Dick Hardt who got me interested in user-centric identity with his great presentation on Identity 2.0. It was funny, wise and asked some very intriguing questions. Dick recently announced he was pulling the plug on Sxipper, his ground-breaking “identity” product, which operated as a Firefox browser plugin. Dick doesn’t much like to use the word “identity” when discussing “identity.” He prefers to talk about identifiers, the tokens we trade in authentication, authorization and other kinds of networked transactions. As a veteran of the Internet Identity Workshops, he knew that the word “identity” is overdetermined and tends to overflow the kind of boundaries required for productive technical discussions.

The difficulty of user-centric systems has always been the contradiction at their core. The user doesn’t own the technical infrastructure needed to support an identity system, so systems that pretended to stand outside the system of systems provided the third leg of the “user-centric” authentication triangle. This multiplied the number of players in the game—supposedly in order to shift the balance of power back to the user. From the user’s perspective, it merely complicated something that was too complicated to begin with.

Sxipper took a different approach, one that is gaining some popularity now in the form of the personal data locker. Sxipper looked at every form a user encountered on the web as an opportunity to learn something new. If Sxipper already understood a form, it would ask you how you wanted it filled out. If you’d populated your persona bank, you could select the appropriate data set, and Sxipper would automatically fill out the form for you. Once you’d trained Sxipper to understand a form, you were all set. Sxipper users also benefited from the community of users: if someone else had trained the form, you were also ready to go. Web transaction forms used by large populations were almost always already trained; at the margins you’d have to do the training yourself. But rather than assume all forms at a Network level (commercial transaction, web site sign-up, authentication, etc.) needed to be accounted for, Sxipper focused, by design, on what people actually did: translating transaction difference in the trenches. As you used Sxipper, the amount of transaction friction you experienced on the web was continuously reduced.
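As described, the mechanics suggest something like the sketch below: the community supplies a trained mapping from a form’s fields to semantic slots, and the user’s locally stored persona supplies the values. All names here are invented; Sxipper’s actual internals weren’t public.

```typescript
// A sketch of the Sxipper pattern: community-trained form knowledge plus
// a locally held persona yields automatic form filling. Names invented.
type FormTraining = Map<string, string>; // input name -> semantic slot
type Persona = Map<string, string>;      // semantic slot -> personal value

function fillForm(
  form: HTMLFormElement,
  training: FormTraining,
  persona: Persona
): void {
  for (const element of Array.from(form.elements)) {
    const input = element as HTMLInputElement;
    const slot = training.get(input.name); // e.g. "ship_to" -> "address"
    const value = slot ? persona.get(slot) : undefined;
    if (value !== undefined) {
      input.value = value; // transaction friction continuously reduced
    }
  }
}
```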

With all this personal data and preference information in a persona bank, you might get the idea that your data could be worth something—that you could trade it for valuable gifts, discounts and prizes. While this model does work, it only works for celebrities. Network celebrity hubs with high numbers of links gain even more links through the phenomenon of preferential attachment. Value flows to these hubs by virtue of their potential distribution power. For example, in Hollywood, if a star can ‘open a movie,’ they’re compensated for it.
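Preferential attachment is simple enough to sketch: each new link picks an existing node with probability proportional to the links it already holds, so the hubs snowball. The numbers below are invented for illustration.

```typescript
// The rich-get-richer dynamic behind celebrity hubs: a new link chooses
// a node with probability proportional to its current link count.
function attach(linkCounts: number[]): number {
  const total = linkCounts.reduce((sum, n) => sum + n, 0);
  let draw = Math.random() * total;
  for (let i = 0; i < linkCounts.length; i++) {
    draw -= linkCounts[i];
    if (draw < 0) return i; // richer nodes occupy more of the interval
  }
  return linkCounts.length - 1;
}

const links = [1, 1, 1, 50]; // one celebrity hub among three unknowns
links[attach(links)] += 1;   // the hub wins ~94% of new attachments
```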

The big networked systems derive value from their scale and the correlation data they unearth from the big data in their custody. The patterns they produce through statistical analysis can be sold under various schemes. Primarily, target groups are sold to advertisers. For most members of the target set, their data is a commodity. Subtract a member from the set and the pattern remains. Your personal data only has value in concert with all the other members who make up the set. As a single point on a graph, your data doesn’t describe a trend.

With all this data flying around, it would seem that personalization of user experiences would naturally follow. And to some extent, a form of this is happening, but it’s through common patterns, not through deep insight into personal data. Augmented reality (the normalization of reference delusions) attempts to personalize the physical space you move through by superimposing targeted advertising-sponsored hypermedia publications on the smear of spatio-temporal location coordinates surrounding you. Reality becomes shelf space, with brands fighting the visual merchandising war for a home in your selection set.

Back to Sxipper: in order to provide personal data from a persona bank to domesticated web forms and transaction interfaces, Sxipper had to sit in a particular spot on the Network. As a browser plugin, or app, as we call them these days, Sxipper could send and receive form training data to a central cloud, and combine that form data with a user’s locally-stored encrypted personal data to fill in forms across many different sites. Rather than harvesting correlation data, Sxipper had no access to its users’ personal data stores; because of this it had no target audiences to sell.

The value of identity and gesture data has been an ongoing discussion in the internet identity community. It seems like there must be a business model in there somewhere. The digital deal, the gesture bank, the attention economy, root markets, vendor relationship management, and now personal data lockers have all explored the system (bank) and account model. Anonymized central bank data can still yield correlation data, the patterns, but it forgoes the regular distribution model except through a user opt-in. It’s a business model that makes sense and honors user privacy, but has yet to be successfully implemented.

The Selector, along with information and action cards, had a structure similar to Sxipper’s, but more general. Essentially, it was a client-side application development environment. Information cards were the equivalent of personal data personas, and action cards extended the capability to run a much wider variety of personal scripts across data from multiple sources. But as with Sxipper, the plug was recently pulled: Microsoft put the final nail in the Selector and the information card.

There are a couple of ideas from Sxipper that I’d like to see survive its demise. The first is focusing on difference rather than identity. Sxipper used the crowd to build bridges between different interfaces and systems. The diversity of the Network environment is one of its strengths. Sxipper preserved the diversity, but made the complexity disappear. The other idea is, rather than generating personalization from a central data bank, create personalization from the user’s side of the glass. Think of personalization as emanating from the person, rather than the system.
