A version of this article was featured in Quirk's Newsletter & Blog.
As AI knocks at the door, let’s acknowledge that the robots are already living within us.
In December of 1847, Britain’s railway companies agreed to adopt Greenwich Mean Time after years of groaning debate, confusion, and mishap (the country as a whole wouldn’t make it law until 1880). GMT provided order and standardization for that other much-lauded system: the railways. In fact, it was originally known as “railway time.” When all clocks finally agreed, nobody would miss a train. But more importantly, the marriage of these two systems may have been pivotal in supercharging the Second Industrial Revolution. Imagine lumbering into the 20th century with Bristol still running ten minutes behind London, or with infrastructure unfit for the ‘machine age’.
Today we’re girding our loins for another kind of machine age, but this time it must get hitched to a more organic companion: humans. Because contemporary machines learn, our attention should really be on what we’re teaching them, and on the psychological infrastructure we’re building upon. The 20th century was obsessed with scale: mass media, mass production, and massive audiences. This scale demanded rigid systems to categorize, count, and coerce. Demographics was born. Data was collected on everything and everyone and then segmented into crude brackets so governments and businesses could efficiently communicate with them. But they weren’t really communicating with anyone. Not in a human sense. The 20th century abandoned talking for ‘broadcasting’, and then zealously mechanized a bunch of other human functions too (like empathizing, but we’ll come back to that!). By the year 2000, a cold and militarized lexicon was being ‘deployed’ to ‘engage’ and ‘execute’ against global consuming ‘targets’.
In one century, we successfully dehumanized huge swathes of human interaction, both in the commercial and social spheres. We turned citizens into robots in the name of efficiency and growth.
It’s no surprise that empathy levels have plummeted over the last 40 years. The pipes are broken, or no longer designed for the way humans are meant to interact. It’s ironic that, just as our machines have almost grasped semantics, meaning, and emotional inference, we have become the machines! As we brace for the AI age, it will become imperative that we rediscover our innate human gifts so that we may bequeath them to our robot offspring.
I’m already seeing fervent petitions for a more human-centric AI-sphere as well as defensive speechifying about how “we won’t forget the people as the bots move in!”
Just like with the railways, we’re experiencing a chaotic moment of arguing about the necessary checks and balances. Unlike the railways, that chaos won’t be solved by more mechanisms. Artificial intelligence demands that our systems of emotional intelligence be up to scratch before we layer on the artificial. We must become humans again so that machines don’t blindly build on the rigidly dumb assumptions already embedded in our big data. Last week I saw how quickly things can go sideways when BuzzFeed published AI-generated Barbies from around the world. Due to biased datasets, we were served a South Sudanese Barbie wielding a machine gun and a German Barbie wearing what looked like an SS uniform. It took only a cursory glance from a feeling human to get them removed, but the episode illustrated how closely we need to partner with the machine.
In marketing, I’ve watched with sadness as linear thinking and scale-chasing suck the life (and the talent) out of this formerly creative profession. Studies show significant declines in funny advertising over the last two decades, despite evidence that humor is more effective. And, in the research business, we’ve allowed reductionism to define our desires and restrict our intuition and imagination. We’ve snuffed out the empathy that (nearly) every human was born with. And we’ve adopted a sterile language that loses sight of emotional beings. Who agreed to lump ‘Gen Z’ into one bucket of activism and disenchantment? Why are we still talking about demographics when we have sophisticated psychographics that tell us much more? What purpose do blunt segmentations serve when AI predicts our behaviors more forensically than ever?
The metaphors need to shift. Next-gen tech is less blunt, so we can remove the weaponry from our language, and add nuance, nature, and even irony (god forbid). The last few decades treated research participants as inanimate objects, training them to hide their true selves in the process. The future of research will be about intuition, prediction, and discernment of idiosyncrasies and contradictions. We can’t glean these from faceless robot people. We’ll need the whole human to show up. So, creativity, play, emotion, and art will play a much larger role.
I believe we’re about to move from a necessarily mechanistic world to a deeply intuitive one. Growth in AI, responsive design, human-centricity, and the migration of the social sciences and art into tech will assist this. But we must remove the legacy plumbing. The language, the systems, and the assumptions about humanity must change. Now that data sorting is cheaper, we can also be smarter with it. We can admit that large segments rarely behave in the same way and that no two Boomers are alike. Maybe we could even agree that ‘user’ is no way to describe a human with complex feelings.
There is some hope. Gen Z shows signs of a more human-centric worldview. In our many conversations with this, the newest cohort to enter the workforce, we’ve seen a complete upending of priorities. A renewed comfort with discussing emotion, mental health, and empathy for those unlike themselves is clear and already infecting wider society.
We’re about to build a whole new tech layer that will be able to organize us beyond our wildest dreams, so we must spend a moment reflecting on the layer beneath it: the human one.
Perhaps we should take a beat to remember how to ‘human’ and throw out the brittle assumptions about our fellow sapiens that formed the backbone of society in the machine age. Let’s banish the robot within ourselves and revive our creative, feeling, intuiting, laughing selves before the decision is no longer ours to make.