Rationalizations, Cocaine, and Cream

As I live in the Netherlands, chances are you’ll find me on a bicycle every now and then. And so it was that on a weekday afternoon, I found myself pedaling away into an insistent western wind trying to persuade me to go in the opposite direction.

On this bike ride, which would turn out to be a memorable one, I was listening to a podcast interview. At one point, the interviewee’s keen insights made me smile, and I did what I often do: use Siri to take a quick hands-free note for future reference. So far, so good.

Swinging Sensually on the Song

The interesting idea I wanted to hold on to was this: what we call “rationalizations” (post hoc, often involuntary) and “framing” (proactive, deliberate) are essentially the same: an attempt by our brain to convince us that we’re okay, that we made the right choice, that what we have is better than what we’ve missed out on. Fascinating, but otherwise incidental to this story. Read on!

The routine I went through may be familiar to iPhone-savvy readers: I double-tapped my AirPods, waited for the Siri chime, said “Take a note,” heard “What do you want it to say?”, and then dictated a rough, improvised version of my thoughts.

As a side note, I love these high-tech tricks and hacks because they’re convenient time-savers—but mostly, if I’m being honest, because they give me the feeling that I’m actually living in a world that was pure science fiction when I was a kid. It’s the ultimate Look, mom, no hands! And usually, it works like a charm: Siri takes your note and reads it back to you.

In this case, however, the system’s finely tuned voice recognition was thwarted: the microphone also picked up the rumbling of the wind all around me. I cannot remember the exact words I spoke, but this is what Siri penned in my new note:

Rationalizations, what time are you swinging sensually on the song: 0792 entrances Lindsay better music reflects.

Say what? But if at first you don’t succeed…

Take Two

That clearly wasn’t what I wanted to say, so I tried again. Already suspecting that the wind noise was the culprit, I decided to go for a pithy, key-word approach. Again, I do not recall my exact words, but this is what my phone turned them into:

Rationalizations, Cocaine, and Cream

The first word was spot-on; score one for so-called AI. But then… cocaine and cream?! Siri, what gives? This sounded like a new Häagen-Dazs ice-cream flavor straight out of some demented alternative universe.

I gave up. By now I’d reprocessed my thoughts often enough that I didn’t need the reminder note anymore. Cocaine and cream it is, indeed. Rationalize that!

A Hallowed Domain

In the moment, I was a bit frustrated at my inability to dictate my thoughts successfully to Siri. But later, reading those two botched attempts to record my speech, I marveled at the complexity of it all.

It is amazing that computer dictation gets so many things right most of the time. Artificial intelligence is nowhere near capable of having actual conversations with us, but the translation of sounds into spelling has come a long way.

Maybe it’s the software’s lack of an actual general intelligence that frees it up to simply brute-force its way into the hallowed domain of human language. Read those sentences again.

  • Rationalizations, what time are you swinging sensually on the song: 0792 entrances Lindsay better music reflects.
  • Rationalizations, cocaine, and cream.

No human being in their right mind would ever write these words. Not while sober, anyway. But they are also not some random lucky draw from a dictionary database.

This is, at best, a somewhat structured but ultimately miscarried attempt to do what ordinary people do non-stop from the time they are toddlers: convert an ongoing series of sound vibrations into meaningful symbolic representations of coherent ideas about physical reality. It’s as if a blind person were trying to describe the image in a painting simply by feeling the texture of the paint strokes with their fingertips.

Epic Fail?

This little adventure in the land of voice recognition was humbling, in a way. Such “malfunctions” throw into relief the fundamental lack of humanity of the tech that surrounds us.

It is the purpose of technology to extend the reach of human capabilities. The invention of the wheel let us transport heavier objects over greater distances. The advent of the bow and arrow let us hunt prey—and humans—with greater accuracy and effectiveness.

But at its best, technology does more than enhance our potential: it can be a transformative force that opens up new perspectives on the world. The microscope and telescope have let us peer into worlds well beyond the range of the human eye. And once we had unveiled the realm of the micro-organism, or the existence of faraway galaxies, there was no going back. The mental landscape in which we situate ourselves had changed irrevocably.

Rationalization Redux

As things stand, Siri et al. are not quite there yet. The arresting juxtaposition of words it distilled from my windblown dictation is strangely evocative, but it makes no sense. It couldn’t have—algorithms don’t do “sense”; they simply process inputs. Even when the voice recognition perfectly transcribes every word I say, it still doesn’t know what it means.

Conversely, it is our own attempts to rescue some semblance of relevance from the jumbled word soup that make reading it so jarring. We want it to mean something; we need it to make sense. And despite our best attempts at rationalization, possibly with some cocaine and cream, it doesn’t.

Now please excuse me while I go swing sensually on the song.

• • •

Top image credit: They listen to the gramophone by Vladimir Makovsky (1910)
