McData: Big data is speeding the decline of civilization

New technology always cops a beating from the complaining class. We are perennially bombarded with doomsday scenarios in which infantilizing new tools render us intellectually and physically flabby: from the calculator to spell check, from GPS to Googling your assignment’s answers. In most cases, I don’t consider myself a complainer. In fact, I believe most of these tools remove unnecessary admin, allowing us to think better and connect better with other people.

But there is a technology sitting underneath all of these that may be doing more damage than we think: big data. I’m not here to scold big tech for stealing our data without our consent; that is being worked on. There’s a more insidious side effect of running all our decisions on one kind of data, and it has been driving a decline for two decades.

In 2000, Google made research binary. By running its first A/B test and popularizing a culture where all tech innovation was based on the method, it precipitated an era of myopia. Until then, researchers drew on many methods; the A/B test (or ‘randomized controlled experiment,’ as I knew it) was just one of the less sophisticated. It was less pluralistic, but it quickly became the gold standard for fast-growing tech players churning out a parade of new products. Through its radical simplicity and seemingly infallible usefulness, it began to drown out other forms of research. That same simplicity also drowned out the need for qualified researchers, pushing them out of the lab.
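To appreciate just how radical that simplicity is, here is a minimal sketch (in Python, with hypothetical traffic numbers) of the entire statistical machinery behind a typical A/B test: split visitors between two variants, count conversions, and check whether the difference clears a significance threshold.

```python
from math import sqrt
from statistics import NormalDist

def ab_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test comparing variant B against variant A."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)        # conversion rate under the null hypothesis
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))    # two-sided p-value
    return p_a, p_b, p_value

# Hypothetical numbers: 10,000 visitors per variant
p_a, p_b, p = ab_test(1000, 10_000, 1150, 10_000)
print(f"A: {p_a:.1%}  B: {p_b:.1%}  p-value: {p:.4f}")
```

A dozen lines, one p-value, and a ship/no-ship verdict. Nothing in it asks why users behaved as they did; that question belongs to the qual this essay mourns.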

The result was a generation of tech bros (and a few ‘bro-ettes’) wielding simplistic data tools, with no one to teach them how to design experiments or interpret their data. Speed and simplicity won over depth and nuance. And, as the companies that propagate this model came to dominate every aspect of our lives, so did their approach to data collection, and to empathy.

It’s easy to see the effect of this in today’s factious political environment. 

Many factors are blamed for today’s societal malaise. Underpinning the most oft-cited contributors, education, the economy, and media (especially its newest forms), sits a more powerful puppeteer. You guessed it: data technology. I am a researcher, so I have no bone to pick with data. Quite the opposite, in fact. But when a two-dimensional form of data is sanctified as divine truth, material problems appear.

A/B testing is a form of quantitative data collection historically used in concert with other methods. It is our single-minded reliance on quant that makes us dumb, because it doesn’t allow us to see (or even ask for) the whole picture. This same tunnel vision and absence of curiosity are driving today’s culture wars and political balkanization. There are three reasons for this.

  1. Firstly, an undue focus on quant drives inequality. Big data is expensive, giving the rich an incumbency advantage: once you have it, others struggle to unseat you, and a data wealth gap opens up.
  2. Secondly, an unbending obsession with ‘the number’ strips us of the intellectual wealth we once had. Every human comes equipped with her own data science department, which she can choose to use or lose. This is our innate power of perception, judgment, and critical thinking. The more we use it, the better it becomes, especially when used in conjunction with more empirical datasets. As a research student, I would be chastised for not triangulating my data sources. So why are we so comfortable with single-sourced data today? Is it because data just got so damned big?
  3. Thirdly, humans are innately equipped to collect and process qualitative data. Traditionally, most big datasets were accompanied by probing, intuitive qual that added context, color, and meaning. With the rise of ‘big’ data (even the term turned monosyllabic), this thoughtful little cousin was banished from the boardroom. When you banish qual, you kick out reason, nuance, and debate. Eventually, quite existential human qualities like cooperation and empathy exit too! We can see this happening in the way social media algorithms channel us into overly simplistic bubbles and spew news at us that peddles increasingly binary narratives. More worrying still is that it’s turning our formerly nuance-reading brains into less analytical, dopamine-craving, binary-happy empty vessels.

The moment ‘big’ became the value by which we judge the usefulness of data, this binary narrative took over and forcibly evicted reason. By muzzling qual, we’re stunting the development of the next generation of critical thinkers. The daily inclusion of qual in each conversation about data used to exercise our brain’s instinct to think and reflect. Qual was the swan’s legs oscillating frenetically under the surface to maintain the grace and order above. The tremendous cognitive work of discerning meaning from much smaller data kept us sharp and sane.

So, without the checks and balances afforded by data’s smaller, more thoughtful incarnation, society grows tetchy and stupid. A culture that respects qualitative as well as quantitative analysis is one that also celebrates critical thinking, debate, and questions. And these are all skills that we could really use right now.