The headline reads: “Introducing the Knowledge Graph: things, not strings.” The implication being, “strings” are bad and limited and “things” are good and what you really wanted all along. After all, people don’t want strings of arbitrary alphanumeric characters in response to their queries; they want the things they’re looking for. And as the advertising message at the end of the introduction says, because you’re getting “things and not strings” on your search result pages, you can spend more time doing the things you love. Who wouldn’t want to do that? The end result of this technological improvement is that your life now contains “more time” — like a toothpaste tube that contains 20% more toothpaste; and that time is filled with love. One might even recast this new product as a machine for filling the world with love.
What Google seems to be introducing is a new user interface to a faceted search. Nothing more. Faceted search acknowledges that the “word” (a single string of characters) isn’t the atom of meaning. Instead it uses the “phrase” in the context of some domain of meaning — a word can be a valid token in multiple systems of meaning. These domains, or facets of meaning, are surfaced and prioritized in search results. So, in addition to Page-ranked links, we get a prioritized set of contexts in which a particular word or phrase is a valid operator. The advance lies in creating an index of sub-domains of meaning by analyzing the structure of text as it’s used on the visible Network. There’s no question that faceted search is superior to classic Page-ranked search; however, the language used to describe this new product innovation seems to suggest some kind of transcendent experience.
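The distinction can be made concrete with a toy model. This is only a sketch, and every name, word, and priority value in it is invented for illustration: a classic index maps a string to ranked pages, while a faceted index maps the same string to a ranked list of contexts in which that string is a valid token.

```python
# Toy illustration (invented data): the same string "jaguar" is a valid
# token in several domains of meaning. A faceted index returns those
# contexts, prioritized, rather than bare links.
FACET_INDEX = {
    "jaguar": [
        ("animal", 0.6),            # priorities are made up for the sketch
        ("automobile", 0.3),
        ("operating system", 0.1),
    ],
}

def facets_for(query):
    """Return the sub-domains of meaning for a query, highest priority first."""
    return sorted(FACET_INDEX.get(query, []), key=lambda f: f[1], reverse=True)

print(facets_for("jaguar"))
# [('animal', 0.6), ('automobile', 0.3), ('operating system', 0.1)]
```

The point of the sketch is only this: “meaning,” here, is nothing more than membership in a precomputed set of contexts. Nothing in the lookup touches the thing a jaguar is.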
Here’s a description of the vision that drives innovation in the search product at Google:
We’ve always believed that the perfect search engine should understand exactly what you mean and give you back exactly what you want.
– Amit Singhal, SVP, Engineering at Google
But when I hear this kind of talk from engineers, their words are drowned out by the characters from Lewis Carroll’s “Through the Looking-Glass”:
‘I don’t know what you mean by “glory”,’ Alice said.
Humpty Dumpty smiled contemptuously. ‘Of course you don’t — till I tell you. I meant “there’s a nice knock-down argument for you!”‘
‘But “glory” doesn’t mean “a nice knock-down argument”,’ Alice objected.
‘When I use a word,’ Humpty Dumpty said, in rather a scornful tone, ‘it means just what I choose it to mean — neither more nor less.’
‘The question is,’ said Alice, ‘whether you can make words mean so many different things.’
‘The question is,’ said Humpty Dumpty, ‘which is to be master — that’s all.’
Alice was too much puzzled to say anything; so after a minute Humpty Dumpty began again. ‘They’ve a temper, some of them — particularly verbs: they’re the proudest — adjectives you can do anything with, but not verbs — however, I can manage the whole lot of them! Impenetrability! That’s what I say!’
‘Would you tell me please,’ said Alice, ‘what that means?’
‘Now you talk like a reasonable child,’ said Humpty Dumpty, looking very much pleased. ‘I meant by “impenetrability” that we’ve had enough of that subject, and it would be just as well if you’d mention what you mean to do next, as I suppose you don’t mean to stop here all the rest of your life.’
‘That’s a great deal to make one word mean,’ Alice said in a thoughtful tone.
‘When I make a word do a lot of work like that,’ said Humpty Dumpty, ‘I always pay it extra.’
We can propose the idea that Google has a search engine that “understands exactly what you mean.” And by this what we mean is that your query corresponds to a sub-domain in the index of facets Google has previously collected. The “meaning” doesn’t lie in the “you” that has the query, but rather in the sets of sub-domains contained in Google’s index. When a word does a lot of work in multiple sub-domains of meaning, they pay extra in compute time.
The claim that Google makes is that they’ve gone from “strings” to “things.” But the sub-domains of meaning that Google is collecting are made up of computable sets of strings, not things. The leap that Google is actually trying to make is from “strings” to “words, phrases and contexts.” But the use of the word “thing” is very revealing. Words are not things, they are indexes. They point at things, suggest things, or function in a play of difference within a system of meaning. When we say that we’ve gone from “strings” to “things” we’re actually making a kind of miraculous claim. We’ve gone from “word” to “thing.” The most prominent example of this algorithm can be found in the King James Bible, in John 1:14:
And the Word was made flesh, and dwelt among us, (and we beheld his glory, the glory as of the only begotten of the Father,) full of grace and truth.
If we believe that Google’s knowledge graph provides “things” and not “strings,” we also believe something extraordinary about the power and capability of Google. Even if we take a step back and simply say that Google is merely indexing sub-domains (systems of meaning), we need to examine what this means. We could follow Wittgenstein and say that “meaning” can be described as a form of life. Therefore Google’s index produces a prioritized list of facets (forms of life) that connect to your form of life, given what they know about you. Popular forms of life that don’t currently connect to you serve as a method of discovery.
Of course, there’s also the popular trend of the flesh made word…
There are registers of meaning that Google’s approach will never capture. Their index will be filled with gaps and pools of darkness. In particular, only a very limited range of metaphor (clichés) will be caught in the net. Metaphor produces meaning through an algorithmic process (per @the_eco_thought, Tim Morton): take a noun, take another noun from a different domain, and place the word “is” between them. The coffee cup is a blue angel. The metaphor machine makes meaning. Not every metaphor is a good one, but each has some modicum of meaning, and each does function as a metaphor.
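The recipe is literal enough to write down. A minimal sketch of such a metaphor machine follows; the noun lists are invented stand-ins for “different domains,” and the real bot discussed below draws from a far larger dictionary:

```python
import random

# Hypothetical noun lists standing in for two different domains of meaning.
DOMAIN_A = ["coffee cup", "stapler", "subway", "ledger"]
DOMAIN_B = ["blue angel", "thunderstorm", "choir", "glacier"]

def metaphor():
    """Take a noun, take another noun from a different domain,
    and place the word 'is' between them."""
    return f"The {random.choice(DOMAIN_A)} is a {random.choice(DOMAIN_B)}."

print(metaphor())
```

Run it a few times and you get sentences that no faceted index of observed usage would ever contain, yet each functions, however weakly, as a metaphor.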
Like the proverbial hundred monkeys typing at a hundred typewriters for a hundred years, the metaphor machines are constantly operating and feeding the Network with new meaning. Darius Kazemi (@tinysubversions) has created a machine called “Metaphor a Minute” that does just this. You can follow it on Twitter at @metaphorminute. Of course, because of Twitter’s rate limits, there’s actually a new metaphor published every two minutes.
“Hold the newsreader’s nose squarely, waiter, or friendly milk will countermand my trousers.”
After thinking through Google’s new service and the language they’ve used to describe it, we discover that they are using the word “things” metaphorically. At first, we may assume that when engineers describe the function of their new software, they’re making literal statements about what the machine they’ve constructed is doing. Instead, they’ve taken two nouns from different domains and inserted the word “is” between them. Ironically, their use of the word “things” is precisely the type of metaphor their new service could not understand. The narrow band of search engine results produced by this system is also being metaphorically called “knowledge.” In order to see these new products clearly, we need to be able to differentiate the rhetoric of hyperbole from the literal functioning of the machine. It also helps to become acquainted with how metaphors mean…