
Contra Optimization: 4th Time Around

The whole train of thought started in the most unlikely spot. It’s a bit of a random walk, an attempt at moving in circles to get closer to a destination. I was listening to a podcast called ‘Sound Opinions’ and Al Kooper was talking about the sessions in Nashville for Bob Dylan’s ‘Blonde on Blonde.’ They didn’t have a tape recorder, so Dylan would teach Kooper the changes and then Kooper would play them over and over again on a piano in Dylan’s hotel room. Dylan worked on the lyrics, Kooper played the changes and gradually, over many hours, the songs took shape.

Kristofferson described the scene: “I saw Dylan sitting out in the studio at the piano, writing all night long by himself. Dark glasses on,” and Bob Johnston recalled to the journalist Louis Black that Dylan did not even get up to go to the bathroom despite consuming so many Cokes, chocolate bars, and other sweets that Johnston began to think the artist was a junkie: “But he wasn’t; he wasn’t hooked on anything but time and space.”

Thinking about that process, I wondered whether it would actually have been made better, more efficient, through the use of a tape recorder. Would the same or better songs have emerged from a process where a tape recorder mechanically reproduced the chord sequence as Dylan worked on the lyrics? Presumably, Kooper didn’t play like a robot, creating an identical sonic experience each time through. While Dylan and Kooper’s repetitive process eventually homed in on the song—narrowing the sonic field to things that seem to work—the resonances of the journey appear to be resident in the grooves. From this observation a question emerged: what is learned from a repetition that isn’t a mechanical reproduction, but rather a kind of performance? This kind of repetition seems to have the shape of an inward spiral.

We rush toward optimization and efficiency; those are the activities that increase the yield of value from our commerce engines. The optimal, by definition, means the best. Recently Nassim Taleb exposed the other side of optimization. When there’s projected relative stability in an environment, as well as stable inputs and outputs for a system, optimization results in a higher, more efficient production of value. In times of instability, change and uncertainty, optimization produces a brittle infrastructure that must use any excess value it generates to prop itself up in the face of unanticipated change. Unless there’s a reversion to the previous stable state, the system eventually suffers a catastrophic failure. Robustness in uncertain times has to be built from flexibility, agility and a managed portfolio of options. Any strategic analysis might first take note of whether one is living in interesting times or not.

Some paths of thought can’t be fully explored by using optimization techniques. We tend to run quickly toward what Tim Morton calls the “top object” or the “bottom object.” The top object is the most general systematic concept from which everything comes (“anything you can do, I can do meta”). To create this kind of schema you need to find a place to stand that allows you to draw a circle around everything—except, of course, the spot on which you’re standing. The bottom object is the tiny fundamental bit of stuff—Democritus’s atom—from which all things are constructed. Physics, though, seems to be having a tough time getting to the bottom of the bottom object—it keeps finding false bottoms, non-local bottoms, anti-bottoms and all kinds of weird goings-on. The idea that there may be ‘turtles all the way down’ no longer seems far-fetched.

Moving in the opposite direction from a solid top or bottom, we run into Graham Harman’s presentation of Bruno Latour’s concept of irreducibility. Here’s Latour on the germ of the idea:

“I knew nothing, then, of what I am writing now but simply repeated to myself: ‘Nothing can be reduced to anything else, nothing can be deduced from anything else, everything may be allied to everything else.’ This was like an exorcism that defeated demons one by one. It was a wintry sky, and a very blue. I no longer needed to prop it up with a cosmology, put it in a picture, render it in writing, measure it in a meteorological article, or place it on a Titan to prevent it falling on my head […]. It and me, them and us, we mutually defined ourselves. And for the first time in my life I saw things unreduced and set free.”

In his book, Prince of Networks, Harman expands on Latour’s idea. No top object, no bottom object, just an encompassing field of objects that form a series of alliances:

“An entire philosophy is foreshadowed in this anecdote. Every human and nonhuman object now stands by itself as a force to reckon with. No actor, however trivial, will be dismissed as mere noise in comparison with its essence, its context, its physical body, or its conditions of possibility. Everything will be absolutely concrete; all objects and all modes of dealing with objects will now be on the same footing. In Latour’s new and unreduced cosmos, philosophy and physics both come to grips with forces in the world, but so do generals, surgeons, nannies, writers, chefs, biologists, aeronautical engineers, and seducers.”

The challenge of Latour’s and Harman’s thought is to think about objects without using the tool of reduction. It’s a strange sensation to think things through without automatically rising to the top, or sinking to the bottom.

Taking the principle in a slightly different direction we arrive at Jeff Jonas’s real-time sensemaking systems and his view of merging and purging data versus an approach he calls entity resolution. Ask any IT worker about any corporate database and they’ll talk about how dirty the data is. It’s filled with errors, bad data and incompatibilities, and it seems they can never get the budget to properly clean things up (disambiguation). The batch-based merge and purge system attempts to create a single correct version of the truth in an effort to establish the highest authority. Here’s Jonas:

“Outlier attribute suppression versus context accumulating: As merge purge systems rely on data survivorship processing they drop outlying attributes, for example, the name Marek might sometimes appear as Mark due to data entry error. Merge purge systems would keep Marek and drop Mark. Entity resolution systems keep all values whether they compete or not, as such, these systems accumulate context. By keeping both Marek and Mark, the semantic reconciliation algorithms can benefit by recognizing that sometimes Marek is recorded as Mark.”

Collecting the errors, versions and incompatibilities establishes a rich context for the data. The data isn’t always bright and shiny, looking its clear and unambiguous best—it has more life to it than that. It’s sorta like when you hear someone called by the wrong name, but you know who’s being talked about anyway. Maybe you don’t offer a correction, but simply continue the conversation.
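The contrast Jonas draws can be sketched in a few lines of code. This is a minimal illustration with invented records and function names, not Jonas’s actual implementation: merge-purge keeps only the “surviving” value and throws the outlier away, while entity resolution keeps every observed value, accumulating context for later reconciliation.

```python
from collections import Counter

# Invented sample records: "Mark" is a data-entry variant of "Marek".
records = [
    {"id": 1, "name": "Marek"},
    {"id": 2, "name": "Marek"},
    {"id": 3, "name": "Mark"},
]

def merge_purge(records):
    """Survivorship processing: keep the most frequent name, drop outliers."""
    counts = Counter(r["name"] for r in records)
    survivor, _ = counts.most_common(1)[0]
    return {"name": survivor}

def entity_resolution(records):
    """Keep all values, competing or not; the variants become context."""
    return {"names": {r["name"] for r in records}}

print(merge_purge(records))                         # {'name': 'Marek'}
print(sorted(entity_resolution(records)["names"]))  # ['Marek', 'Mark']
```

In the merge-purge version, the fact that Marek is sometimes recorded as Mark is unrecoverable; in the entity-resolution version, a reconciliation step can still learn from the variant.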

And this brings us back to Al Kooper banging out the changes on a piano in a hotel room, while Dylan sits hunched over a typewriter, pounding out lyrics. Somehow out of this circling through the songs over and over again, the thin wild mercury sound of Blonde on Blonde eventually took hold in the studio and was captured on tape.

Plotting your route as the crow flies is one way to get to a destination. But I have to wonder if crows really do always fly as the crow flies.


  1. aronski | May 10th, 2011 | 7:34 pm

@cgerrish The imperfect duplication leads to errors and innovation. Happy accidents become new cliches and classic moments. Human interface FTW.

  2. cgerrish | May 11th, 2011 | 5:31 am

    Duplication that isn’t duplication, a net for catching errors, listening for signal in the noise.