

Internet Identity: Speaking in the Third Person

It’s common to think of someone who refers to themselves in the third person as narcissistic. They’ve posited a third person outside of themselves, an entity who in some way is not fully identical with the one who is speaking. When we speak on a social network, we speak in the third person. We see our comment enter the stream not attributed to an “I”, but in the third person.

The name “narcissism” derives from Greek mythology. Narcissus was a handsome Greek youth who had never seen his own reflection, though an Oracle had predicted that one day he would. The nymph Echo, who had been punished by Hera for gossiping and cursed forever to have the last word, had seen Narcissus walking through the forest and wanted to talk to him, but because of her curse she could not speak first. As Narcissus walked along, he grew thirsty and stopped at a pool to drink; it was then that he saw his reflection for the first time and, not knowing any better, began talking to it. Echo, who had been following him, repeated the last of his words back to him. Knowing nothing of reflections, Narcissus believed his reflection was answering him. Unable to consummate his love, Narcissus pined away at the pool and was changed into the flower that bears his name, the narcissus.

The problem of internet identity might easily be solved by having all people and systems use the third person. A Google identity would be referred to within Google in the third person, as though it came from outside of Google. Google’s authentication and authorization systems would be decentralized into an external hub, and Google would use them in the same way as a third party. Facebook, Twitter, Microsoft, Apple and Yahoo, of course, would follow suit. In this environment a single internet identity process could be used across every web property. Everyone is a stranger, everyone is from somewhere else.

When we think of our electronic identity on the Network, we point over there and say, “that’s me.” But “I” can’t claim sole authorship of the “me” at which I gesture. If you were to gather up and value all the threads across all the transaction streams, you’d see that self-asserted identity doesn’t hold a lot of water. It’s what other people say about you when you’re out of the room that really matters.

What does it matter who is speaking, someone said, what does it matter who is speaking?
Samuel Beckett, Texts for Nothing

Speaking in the third person depersonalizes speech. Identity is no longer my identity; instead it's the set of qualities that can be used to describe a third person. And in the world of commercial transactions, a business doesn't care about who you are; it cares whether the conditions for a successful transaction are present, though it may also care about collecting metadata that lets it predict the probability that those conditions will recur.

When avatars speak to each other, the conversation is in the third person. Even when the personal pronoun “I” is invoked, we see it from the outside. We view the conversation just as anyone might.


Human Factors: Zero, One, Infinity

Software is often designed with three “numbers” in mind: zero, one and infinity. In this context, infinity means that a value can be any number; there's no reason to put random or artificial limits on what that number might be. This idea that any number might do is at the bottom of what some people call information overload. For instance, we can very easily build a User-Managed Access (UMA) system with infinite reach and granularity. Facebook, trying to respond to a broad set of use cases, produced an access control / authorization system that answered them with a complex control panel. Facebook users largely ignored it, choosing instead to wait until something smaller and more usable came along.

Allow none of foo, one of foo, or any number of foo.
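That aphorism can be sketched in a few lines of Python. This is a hypothetical illustration of the Zero-One-Infinity rule (the function and policy names are invented, not any real API): the only cardinalities worth enforcing are none, exactly one, or unbounded; anything like an arbitrary cap of seven is the smell the rule warns against.

```python
def add_foo(foos, foo, policy):
    """Add `foo` to the list under one of the three sane cardinality policies.

    policy is "zero" (allow none), "one" (allow exactly one),
    or "infinity" (allow any number -- no artificial limit).
    """
    if policy == "zero":
        raise ValueError("no foo allowed")
    if policy == "one" and len(foos) >= 1:
        raise ValueError("only one foo allowed")
    # "infinity": any number is fine; we impose no arbitrary maximum
    foos.append(foo)
    return foos

print(add_foo([], "a", "one"))               # a single foo is permitted
print(add_foo(["a", "b"], "c", "infinity"))  # any number is permitted
```

The point of the sketch is what's absent: there is no `MAX_FOOS = 7` branch, because any limit other than zero or one would be an arbitrary design decision.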

Privacy is another way of saying access control or authorization. We tend to think about privacy as personal information that is unconnected, kept in a vault that we control. When information escapes across these boundaries without our knowledge, we call this a data breach. This model of thinking is suitable for secrets that are physically encoded on paper or the surface of some other physical object. Drama is injected into this model when a message is converted to a secret code and transmitted. The other dramatic model is played out in Alfred Hitchcock’s The 39 Steps, where a secret is committed to human memory.

Personal information encoded in electronic communications systems on the Network is always already outside of your personal control. This idea of vaults and breached boundaries is a metaphor imported from an alien landscape. When we talk about privacy in the context of the Network, it's more a matter of knowing who or what has access to your personal information; who or what can authorize access to it; and how this leg is connected to the rest of the Network. Of course, one need only Google oneself, or take advantage of any of the numerous identity search engines, to see how much of the cat is already out of the bag.

The question arises, how much control do we want over our electronic personal information residing on the Network? Each day we throw off streams of data as we watch cable television, buy things with credit cards, use our discount cards at the grocery, transfer money from one account to another, use Twitter, Facebook and Foursquare. The appliances in our homes have unique electrical energy-use signatures that can be recorded as we turn on the blender, the toaster or the lights in the hallway.

In some sense, we might be attempting to recreate a Total Information Awareness (TIA) system that correlates all data that can be linked to our identity. Can you imagine managing the access controls for all these streams of data? It would be rather like having to consciously manage all the biological systems of our body. A single person probably couldn't manage the task; we'd need to bring on a staff to take care of all the millions of details.

Total Information Awareness would be achieved by creating enormous computer databases to gather and store the personal information of everyone in the United States, including personal e-mails, social network analysis, credit card records, phone calls, medical records, and numerous other sources, without any requirement for a search warrant. This information would then be analyzed to look for suspicious activities, connections between individuals, and “threats”. Additionally, the program included funding for biometric surveillance technologies that could identify and track individuals using surveillance cameras, and other methods.

Here we need to begin thinking about human numbers, rather than abstract numbers. When we talk about human factors in a human-computer interaction, generally we’re wondering how flexible humans might be in adapting to the requirements of a computer system. The reason for this is that humans are more flexible and adapt much more quickly than computers. Tracing the adaptation of computers to humans shows that computers haven’t really made much progress.

Think about how humans process the visual information entering our system through our eyes. We ignore a very high percentage of it. We have to or we would be completely unable to focus on the tasks of survival. When you think about the things we can truly focus our attention on at any one time, they’re fewer than the fingers on one hand. We don’t want total consciousness of the ocean of data in which we swim. Much like the Total Information Awareness system, we really only care about threats and opportunities. And the reality, as Jeff Jonas notes, is that while we can record and store boundless amounts of data— we have very little ability to make sense of it.

Man continues to chase the notion that systems should be capable of digesting daunting volumes of data and making sufficient sense of this data such that novel, specific, and accurate insight can be derived without direct human involvement. While there are many major breakthroughs in computation and storage, advances in sensemaking systems have not enjoyed the same significant gains.

When we admire simplicity in design, we enjoy finding a set of interactions with a human scale. We see an elegant proportion between the conscious and the unconscious elements of a system. The unconscious aspects of the system only surface at the right moment, in the right context. A newly surfaced aspect displaces another item to keep the size of focus roughly the same. Jeff Jonas advocates designing systems that engage in perpetual analytics, always observing the context to understand what's changed; the unconscious cloud is always shifting to reflect the possibilities of the conscious context.

We’re starting to see the beginnings of this model emerge in location-aware devices like the iPhone and iPad. Mobile computing applications are constantly asking about location context in order to find relevant information streams. Generally, an app provides a focused context in which to orchestrate unconscious clouds of data. It’s this balance between the conscious and the unconscious that will define the new era of applications. We’ll be drawn to applications and platforms that are built with human dimensions— that mimic, in their structure, the way the human mind works.

Our lives are filled with infinities, but we can only live them because they are hidden.


The Nature Of The Good And The Neutrality Of The ‘Check-In’ Gesture

“Just checking in.” It’s such a neutral phrase. It doesn’t imply any engagement or transaction— the connection has been opened and tested, but no activity is required or expected. From a Unix command line, the ping command serves a similar function. The social geolocation services have brought the “check-in” into common parlance on the Network. The FourSquare check-in can be a neutral communication— no message attached, merely a statement that I’m at such-and-such a location.

The neutrality of the “check-in” gesture began to interest me as I started thinking about the explicit gesture of giving a star rating to a restaurant. While recently visiting New York City, I decided to try to make use of the Siri and FourSquare apps on my iPhone. I could be observed sitting on a park bench saying ‘good pizza place near here’ into my iPhone and eagerly waiting for Siri to populate a list of restaurant options. I also checked in using FourSquare from several locations around Manhattan. When Siri returned its list of ‘good pizza places’ near me, it used the services of partner web sites that let users rate restaurants and other businesses on a one-to-five-star system. When I asked for good pizza places, that translated into the restaurants with the most stars.

The interesting thing about user ratings of businesses by way of the Network is that it’s completely unnecessary for the user to actually visit, or be a customer of, the business. The rating can be entirely fictional. Unless you personally know the reviewer and the context in which the review is proffered— a good, bad or ugly review may be the result of some alternate agenda. There’s no way to determine the authenticity of an unknown, or anonymous, reviewer. Systems like eBay have tried to solve this problem using reputation systems. Newspapers have tried to solve this problem by hiring food critics who have earned the respect of the restaurant ecosystem.

So, while Siri did end up recommending a good Italian restaurant, the Chinese restaurant it recommended was below par. Both restaurants had the same star ratings and number of positive reviews. This got me thinking about the securitization of the networked social gesture. Once a gesture has even a vaguely defined monetary value, there’s a motivation to game the system. If more stars equals a higher ranking on Siri’s good pizza place list, then how can a business get more stars? What’s the cost?

I ran across a tweet that summed up the dilemma of wanting a list of ‘good pizza places’ rather than simply ‘pizza places.’ I use FriendFeed as a Twitter client, and while watching the real-time stream I saw an interesting item float by. Tara Hunt retweeted a micro-message from Deanna Zandt referring to a presentation by Randy Farmer at the Web 2.0 conference on Building Web Reputation Systems. Deanna’s message read: “If you show ppl their karma, they abuse it.” When reputation is assigned a tradable value, it will be traded. In this case, ‘abuse’ means traded in an unintended market.

Another example of this dilemma cropped up in a story Clay Shirky told at the Gov 2.0 Summit about a day care center. The day care center had a problem with parents who arrived late to pick up their children. Wanting to nip the problem in the bud, it instituted a fine for late pick-up. What had been a social contract around respecting the value of another person’s time was transformed into a new service with a set price tag. “Late pick-up” became a new feature of the day care center, and those parents who could afford it welcomed the flexibility it offered them. Late pick-ups tripled; the new feature was selling like hot cakes. Assigning a dollar value to the bad behavior of arriving late changed the context from one of mutual respect to one of payment for service. Interestingly, even when the fines were eliminated, the higher rate of bad behavior continued.

Now let’s tie this back to the neutral gesture of the check-in. While in some respect the reporting of geolocation coordinates is a mere statement of fact— there’s also the fact that you’ve chosen to go to the place from which you’ve checked in. There’s a sense in which a neutral check-in from a restaurant is a better indicator of its quality than a star rating accompanied by explicit user reviews. If a person in my geo-social network checks in from a restaurant every two weeks or so, I’d have to assume that they liked the restaurant. The fact that they chose to go there more than once is a valuable piece of information to me. However, when game mechanics are attached to the neutral check-in gesture, a separate economics is overlaid. If the game play, rather than the food, provides the motivation for selecting a restaurant, then the signal has been diluted by another agenda.
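The intuition that repeat visits are an implicit endorsement can be sketched in a few lines of Python. The check-in data, the function name, and the visit threshold below are all invented for illustration; this is not how FourSquare or any real service computes reputation.

```python
from collections import Counter

# Hypothetical check-in stream: (person, venue) pairs, one per check-in.
checkins = [
    ("alice", "Luigi's Pizza"),
    ("alice", "Luigi's Pizza"),
    ("alice", "Luigi's Pizza"),
    ("bob", "Luigi's Pizza"),
    ("bob", "Canal Noodles"),
]

def implicit_favorites(checkins, min_visits=2):
    """Return (person, venue) pairs where the person returned at least
    min_visits times -- repeat visits read as an implicit endorsement."""
    visits = Counter(checkins)  # (person, venue) -> number of check-ins
    return {pair for pair, n in visits.items() if n >= min_visits}

print(implicit_favorites(checkins))
# alice's repeat visits to Luigi's say more than a star rating would
```

No one in this sketch ever states an opinion; the signal is derived entirely from the choice to go back, which is exactly what makes it harder to fake than a review.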

By binding the check-in to the place via the geolocation technology of the device, a dependable, authentic piece of information is produced. Social purchase publishing services, like Blippy, take this to the next level. Members of this network agree to publish an audit trail of their actual purchases. Because their credit card transaction reports are linked in real time to a publishing tool, followers know what a person is actually deciding to purchase. A pattern of purchases would indicate some positive level of satisfaction with a product or service.

The pattern revealed in these examples is that the speech of the agent cannot be trusted. So instead we look to the evidence of the transactions initiated by the agent, and we examine the chain of custody across the wire. A check-in, a credit card purchase— these are the authentic raw data from which an algorithm amalgamates some probability of the good. We try to structure the interaction data so that it has the form of a falsifiable proposition. The degree to which a statement of quality can be expressed as an on-or-off bit defines a machine’s ability to compute with it. A statement that is overdetermined, radiating multiple meanings across multiple contexts, doesn’t compute well and results in ambiguous output. The pizza place seems to occupy multiple locations simultaneously across the spectrum of good to bad.

Can speech be rehabilitated as a review gesture? I had a short conversation with Randy Farmer at the recent Internet Identity Workshop (IIW 10) about what he calls the “to: field” in networked communications. The basic idea is that all speech should be directed to some individual or group. A review transmitted to a particular social group acquires the context of the social relations within the group. Outside of that context its value is ambiguous while purporting to be clear. Farmer combines restricted social networks and falsifiable propositions in his post “The Cake is a Lie” to get closer to an authentic review gesture and therefore a more trustworthy reputation for a social object.

Moving through this thought experiment one can see the attempt to reduce human behavior and social relations to falsifiable, and therefore computable, statements. Just as a highly complex digital world has been built up out of ones and zeros, the search for a similar fundamental element of The Good is unfolding in laboratories, research centers and start ups across the globe. Capturing the authentic review gesture in a bottle is the new alchemy of the Network.

What’s So Funny About Peace, Love and Understanding?
Nick Lowe

As I walk through
This wicked world
Searching for light in the darkness of insanity.

I ask myself
Is all hope lost?
Is there only pain and hatred, and misery?

And each time I feel like this inside,
There’s one thing I wanna know:
What’s so funny about peace love & understanding? ohhhh
What’s so funny about peace love & understanding?

And as I walked on
Through troubled times
My spirit gets so downhearted sometimes
So where are the strong
And who are the trusted?
And where is the harmony?
Sweet harmony.

Cause each time I feel it slipping away, just makes me wanna cry.
What’s so funny bout peace love & understanding? ohhhh
What’s so funny bout peace love & understanding?

So where are the strong?
And who are the trusted?
And where is the harmony?
Sweet harmony.

Cause each time I feel it slippin away, just makes me wanna cry.
What’s so funny bout peace love & understanding? ohhhh
What’s so funny bout peace love & understanding? ohhhh
What’s so funny bout peace love & understanding?


The Enculturation of the Network: Totem and Taboo

Thinking about what it might mean to stand at the intersection of technology and the humanities has resulted in an exploration with a very circuitous route.

The Network has been infused with humanity, with every aspect of human character— the bright possibilities and the tragic flaws.

On May 29, 1919, Arthur Stanley Eddington took some photographs of a total eclipse of the sun. Eddington had gone to Africa to conduct an experiment that might determine whether Newton’s or Einstein’s model was closer to physical reality.

During the eclipse, he took pictures of the stars in the region around the Sun. According to the theory of general relativity, stars with light rays that passed near the Sun would appear to have been slightly shifted because their light had been curved by its gravitational field. This effect is noticeable only during eclipses, since otherwise the Sun’s brightness obscures the affected stars. Eddington showed that Newtonian gravitation could be interpreted to predict half the shift predicted by Einstein.

My understanding of the physics is rather shallow, my interest is more in the metaphorics— in how the word-pictures we use to describe and think about the universe changed based on a photograph. Where the universe lined up nicely on a grid before the photograph, afterwards, space became curvaceous. Mass and gravity bent the space that light passed through. Assumed constants moved into the category of relativity.

The Network also appears to be composed of a neutral grid, its name space, through which passes what we generically call payloads of “content.” Each location has a unique identifier; the only requirement for adding a location is that its name not already be in use. You can’t stand where someone is already standing unless you displace them. No central authority examines the suitability of the node’s payload prior to its addition to the Network.

The universe of these location names is expanding at an accelerating rate. The number of addresses on the Network quickly outstripped our ability both to put them into a curated index and to use, or even understand, that index. Search engines put as much of the Network as they can spider into the index and then use software algorithms to determine a priority order of the contents of the index based on keyword queries. The search engine itself attempts to be a neutral medium through which the nodes of the Network are prioritized based on user query input.

Regardless of the query asked, the method of deriving the list of prioritized results is the same. The method and production cost for each query is identical. This kind of equal handling of Network nodes with regard to user queries is the search engine equivalent of freedom, opportunity and meritocracy for those adding and updating nodes on the Network. The algorithms operate without prejudice.

The differential value of the queries and prioritized link lists is derived through an auction process. The cost of producing each query/result set is the same—it is a commodity—but the price of buying advertising is determined by the intensity of the advertiser’s desire. The economics of the Network requires that we develop strategies for versioning digital commodities and enable pricing systems linked to desire rather than cost of production. Our discussions about “Free” have to do with cost-based pricing for digital information goods. However, it’s by overlaying a map of our desires on to the digital commodity that we start to see the contours, the curvaceousness of this space, the segments where versioning can occur.

We’ve posited that the search algorithm treats all nodes on the Network equally. And more and more, we take the Network to be a medium that can fully represent human life. In fact, through various augmented reality applications, human reality and the Network are sometimes combined into a synthetic blend (medium and message). Implicitly we also seem to be asserting a kind of isomorphism between human life and the Network. For instance, sometimes we’ll say that on the Network, we “publish everything, and filter later.” The gist of this aphorism is that where there are economics of low-or-no-cost production, there’s no need to filter for quality in advance of production and transfer to the Network. Everything can be re-produced on the Network and then sorted out later. But when we use the word “everything,” do we really mean everything?

The neutral medium of the Network allows us to disregard the payload of contents. Everything is equivalent. A comparison could be made to the medium of language— anything can be expressed. But as the Network becomes more social, we begin to see the shape of our society emerge within the graph of nodes. Sigmund Freud, in his 1913 book entitled Totem and Taboo, looks at the markers that we place on the border of what is considered socially acceptable behavior. Ostensibly, the book examines the resemblances between the mental life of savages and neurotics. (You’ll need to disregard the archaic attitudes regarding non-European cultures.)

We should certainly not expect that the sexual life of these poor, naked cannibals would be moral in our sense or that their sexual instincts would be subjected to any great degree of restriction. Yet we find that they set before themselves with the most scrupulous care and the most painful severity the aim of avoiding incestuous sexual relations. Indeed, their whole social organization seems to serve that purpose or to have been brought into relation with its attainment.

Freud is pointing to the idea that social organization, while certainly containing positive gestures, reserves its use of laws, restrictions and mores for the negative gesture. The structure of societal organization to a large extent rests on what is excluded, what is not allowed. He finds this common characteristic in otherwise very diverse socio-cultural groups. Totems and taboos bend and structure the space that our culture passes through.

In the safe-search filters employed by search engines, we can see the ego, id and superego play out their roles. When we search for transgressive content, we remove all filtering. But presumably we do, as members of a society, filter everything before we re-produce it on the Network. Our “unfiltered” content payloads are pre-filtered through our social contract. Part of the discomfort we have with the Network is that once transgressive material is embodied in the Network, the algorithms disregard any difference between the social and the anti-social. A boundary that is plainly visible to a human— and is in fact a structural component of human identity and society— is invisible to the machine. Every node on the Network is processed identically by the algorithm.

This issue has also been raised in discussions about the possibility of artificial intelligence. In his book Mirror Worlds, David Gelernter discusses a key difference between human memory and machine memory:

Well for one thing, certain memories make you feel good. The original experience included a “feeling good” sensation, and so the tape has “feel good” recorded on it, and when you recall the memory— you feel good. And likewise, one reason you choose (or unconsciously decide) not to recall certain memories is that they have “feel bad” recorded on them, and so remembering them makes you feel bad.

But obviously, the software version of remembering has no emotional compass. To some extent, that’s good: Software won’t suppress, repress or forget some illuminating case because (say) it made a complete fool of itself when the case was first presented. Objectivity is powerful.

Objectivity is very powerful. Part of that power lies in not being subject to personal foibles and follies with regard to the handling, sorting, connecting and prioritizing of data. The dark side of that power is that the objectivity of the algorithm is not subject to social prohibitions either. They simply don’t register. To some extent technology views society and culture as a form of exception processing, a hack grafted on to the system. As the Network is enculturated, we are faced with the stark visibility of terrorism, perversity, criminality, and prejudice. On the Network, everything is just one click away. Transgression isn’t hidden in the darkness. On the Network, the light has not yet been divided from the darkness. In its neutrality there is a sort of flatness, a lack of dimensionality and perspective. There’s no chiaroscuro to provide a sense of volume, emotion, limit and mystery.

And finally, here’s the link back to the starting point of this exploration. A kind of libertarian connection has been made between the neutral quality of the medium of the Network and our experience of freedom in a democratic republic. The machine-like disregard for human mores and cultural practices is held up as a virtue and an example for human behavior. No limits can be imposed on the payloads attached to any node of the Network. The libertarian view might be stated this way: the fewest possible limitations should be applied to payloads while still maintaining some semblance of society. Freud is instructive here: our society is fundamentally defined by what we exclude, by what we leave out, and by what we push out. While our society is more and more inclusive, everything is not included. Mass and gravity bend the space that light passes through.

The major debates on the Network seem to line up with the contours of this pattern. China excludes Google and Google excludes China. Pornographic applications are banished from Apple’s AppStore. Android excludes nothing. Closed is excluded by Open, Open is included by Closed. Spam wants to be included, users want to exclude spam. Anonymous commenters and trolls should be excluded. Facebook must decide what the limits of speech are within the confines of its domain. The open internet excludes nothing. Facebook has excluded the wrong thing. The open internet has a right to make your trade secrets visible. As any node on the Network becomes a potential node in Facebook’s social/semantic graph, are there nodes that should be taboo? How do we build a civil society within the neutral medium of the Network? Can a society exist in which nothing is excluded?

In the early days of the Network, it was owned and occupied by technologists and scientists. The rest of humanity was excluded. As the Network absorbs new tribes and a broader array of participants, its character and its social contract have changed. It’s a signal of a power shift, a dramatic change in the landscape. And if you happen to be standing at the crossroads of technology and the humanities, you might have a pretty good view of where we’re going.
