The Algorithm: what it actually is, and why it matters.

Sarah Pollok
8 min read · Feb 20, 2021

Whether you’ve seen The Social Dilemma several times, work in social media or simply participate in the world as it is today, chances are you are somewhat familiar with the ‘algorithm’.

As we tend to do, society has been quick to anthropomorphise this amorphous and unpredictable equation. Like a personal guardian angel of content behind each of our screens, we discuss how it ‘learns’ and ‘curates’, even going so far as to defend or villainise it, often dismissive of the fundamental fact that few of us (occasionally including its creators) know exactly how it works. Or even what it is.

Upon reflection, it seems somewhat ridiculous that we could be so content with knowing so little about something that dictates the majority of what we view. Perhaps because it never seems to pose that much of a threat.

Yet, as its scope of influence grows more expansive (in ways we don’t fully understand or see), it becomes increasingly important to know exactly how our media is being shaped and served, and what the consequences are.

WHAT THE ALGORITHM DOES — — —
It may come as a surprise, but algorithms have existed for millennia, long before our smartphones and laptops.

Defined as a process or set of rules followed to solve a problem, an algorithm can take the form of your favourite cookie recipe, long division or, as most often recognised today, a content-organising system on a digital platform.

Regardless of form, algorithms exist to ‘optimise’: to minimise the cognitive effort, thought, decisions or judgements required by an individual to reach a certain outcome. For the recipe, the outcome is delicious cookies; for the digital platform, it’s interesting content. Fundamentally, a useful and noble tool.
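
To make that idea concrete, here is a minimal sketch in Python of what a content-organising algorithm boils down to: a fixed rule that turns a pile of options into an ordered list so the viewer doesn’t have to. The titles and scores are invented purely for illustration; no real platform is this simple.

```python
# A toy 'algorithm' in the broad sense: a fixed set of rules that takes
# a messy pile of options and returns them in a useful order.
# Titles and scores are invented for illustration only.

catalogue = [
    {"title": "Cooking show", "predicted_interest": 0.4},
    {"title": "True-crime documentary", "predicted_interest": 0.9},
    {"title": "Foreign-language drama", "predicted_interest": 0.6},
]

def recommend(items):
    """The rule: sort everything by how interesting we guess it will be."""
    return sorted(items, key=lambda item: item["predicted_interest"], reverse=True)

for item in recommend(catalogue):
    print(item["title"])
```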

MISGUIDED ALTRUISM OF ALGORITHMS — — —
Yet, soon after algorithms were incorporated into media platforms, an ambient sort of anxiety emerged, one that makes a lot of sense when you realise the outcome these pieces of code are optimising for isn’t necessarily human satisfaction or fulfilment, but attention.

As Cathy O’Neil, a Harvard mathematician and data scientist, explains it: “if a commercial enterprise builds an algorithm, to their definition of success, it’s a commercial interest. It’s usually profit.” And, as anyone in advertising will tell you, attention can be cashed in for dollars by the millisecond.
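
A hedged sketch of that shift in objective: the ranking rule stays exactly the same, but the number being maximised is predicted watch time (a stand-in for attention, and ultimately revenue) rather than how satisfied viewers report feeling afterwards. The titles and figures below are made up; they only show that changing the objective changes what rises to the top.

```python
# Invented illustration: the ranking rule is unchanged, only the quantity
# being optimised differs. Ranking by predicted watch time (attention) can
# surface different content than ranking by reported satisfaction.

catalogue = [
    {"title": "Outrage compilation", "watch_minutes": 45, "satisfaction": 0.3},
    {"title": "Thoughtful documentary", "watch_minutes": 20, "satisfaction": 0.9},
    {"title": "Comfort rewatch", "watch_minutes": 90, "satisfaction": 0.6},
]

def rank(items, objective):
    """Sort the catalogue by whichever metric the platform chooses to maximise."""
    return sorted(items, key=lambda item: item[objective], reverse=True)

print([i["title"] for i in rank(catalogue, "watch_minutes")])  # what attention rewards
print([i["title"] for i in rank(catalogue, "satisfaction")])   # what satisfaction rewards
```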

Does this mean our ‘recommended’ list isn’t crammed with engaging content? Absolutely not. However, just because something is watchable or popular doesn’t mean it will satisfy our mercurial tastes and unique desires.

This change in the prioritisation of media, one governed by ‘watchable-ness’, comes with consequences: consumption becomes passive, culture becomes homogenised and consumers are rarely the wiser.

PASSIVE CONSUMPTION OF ENDLESS CONTENT — — —
To understand the rapid and thorough integration of algorithms into media platforms, one must first observe the environment within which content sits and how it has changed.

Today, Netflix NZ has 450 TV shows and 1,563 movies, YouTube sees 300 hours of video uploaded every minute, and Spotify boasts 50 million songs. The amount of content available (and the power/money riding on us watching it) has grown exponentially, yet the length of a day hasn’t moved a minute.

So many options (so many of which will be uninteresting or unentertaining), so little time.

Enter the algorithm: a valiant formula offering to tirelessly carve through the malignant sea of media and unite us with content we want.

Sitting down to choose from 1,563 films on Friday night, it’s no surprise we jump at the chance to outsource the sifting and sorting to something that knows our tastes. Or at least, can take a good enough stab.

Yet, this convenience doesn’t merely enhance how we interact with media but fundamentally changes it from discerning and active to apathetic and passive.

Spotify will queue a similar song, Netflix will recommend a related show and all one must do is sit back and let it happen. If the algorithm gets it a little wrong, if you don’t vibe with the proposed Instagram profile or YouTube video, you skip with the unwavering confidence that there will always be something else.

Yet, this confidence isn’t necessarily a good thing, according to the renowned director Martin Scorsese. “I’m concerned about pictures being suggested by algorithms,” Scorsese said at the 2020 International Film Festival.

“The algorithms, they tell you all these things: ‘If you like that, you might enjoy this…’ And ‘If you don’t enjoy this right away, there’s something else. And something else after that.’” The director described these endless options as a burden, one that turns engaged viewers into acquiescent watchers.

Rewind 40 years and there were just two television channels in New Zealand. Didn’t like what was on? Suck it up or switch it off. Now, we flick forward to the next of infinite options until we find something good. Or, at the least, something better than the unimaginable void of nothing.

To be clear, passive consumption isn’t new. Television has long provided flickering narratives we’ve relaxed to the point of inertia in front of.

What is different is the individualised organisation and targeted presentation of what we are sat before, which has become unnervingly good at keeping us there.

Where a tacky soap opera or late-night infomercial would have previously jolted us from the couch slump stupor and off to bed, there is now always something just good enough to stave off sleep for.

Consequently, time spent in front of a television set is in steady decline, while time spent on digital platforms like YouTube, Netflix, Neon and Amazon has skyrocketed.

In New Zealand, the average adult spends over 17 hours each week watching linear television, and an additional 9 hours watching online — a metric that has increased by 60% since 2014. As of 2020, 82% of Aotearoa have access to a streaming service at home.

ALGORITHM CREATES MONOCULTURE — — —
TV statistics aside, the quality of ‘good-enough-to-watch-ness’ doesn’t seem particularly nefarious. However, according to Bart Knijnenburg, an assistant professor in human-centred computing at Clemson University, it’s a catalyst for monoculture, which has the potential to cause damage on an individual and societal scale.

“The goal of algorithms is to fit some of our preferences, but not necessarily all of them,” Knijnenburg said. As a result, it’s in an algorithm’s best interests to eliminate anything challenging, foreign, unusual or difficult, and to instead “present a caricature of our tastes and preferences” that is vague enough to appeal en masse.

Herein lies the problem.

Automated feeds designed for frictionless consumption can’t facilitate one of art’s foundational qualities: serendipity.

Instead, they are masters of monotony, carrying us towards ubiquity and familiarity rather than diversity or difficulty. Ironically, the limitless, democratised pool of media we can choose from is whittled back down to a small assortment of content that creates an overbaked monoculture: Tiger King and Bridgerton, Ozark and Emily in Paris.
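
One way to picture that narrowing is as a feedback loop: if the rule is always ‘show more of whatever was just watched’, the range of what we see collapses pass by pass. The sketch below is a deliberately crude illustration with an invented, one-dimensional ‘novelty’ score; it is not how any real recommender works, but it shows the direction of travel.

```python
# Crude illustration of a recommendation feedback loop, with invented numbers.
# Each item has a single 'novelty' score; the rule 'recommend only what is
# closest to your recent viewing' steadily narrows the range of what you see.

viewing_history = [0.5]                    # start from one middle-of-the-road pick
catalogue = [i / 10 for i in range(11)]    # novelty from 0.0 (familiar) to 1.0 (challenging)

for _ in range(5):
    last = viewing_history[-1]
    # Rule: surface the three items most similar to the last thing watched...
    shortlist = sorted(catalogue, key=lambda novelty: abs(novelty - last))[:3]
    # ...and assume the viewer picks the most familiar of those.
    viewing_history.append(min(shortlist))

print(viewing_history)  # drifts towards 0.0: ever more familiar, ever less challenging
```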

The consequences of monoculture aren’t anything to laugh at either. Adam Gismondi, a director at Tufts University’s Institute for Democracy & Higher Education, has spent years studying how monocultures diminish our capacity for empathy by increasing segregation of different ideologies and reducing contact with different viewpoints. A process that, at its extreme, can undermine the fabric of a just society.

THE ISSUE WITH ‘BEING AWARE’ — — —

The grounds for resisting the algorithm and its influence stack rather high. However, if one does decide to resist, they’ll soon discover this modern power can’t be fought using conventional forms of opposition.

Say ‘resistance’ and you’ll likely think of rallies and boycotts with petitions and crowds. Organised and collective action that invalidates a dominant force.

As a phrase, ‘resisting the algorithm’ may seem unnecessarily anarchic, bringing to mind tinfoil hats and scribbled picket signs. Yet, as Disney CEO Bob Iger described it at a Wall Street Journal technology conference, it is as simple as not allowing “computers to decide what you want”.

However, this becomes difficult when you realise just how integrated algorithm-using platforms have become in life.

According to anthropologist James C. Scott in “Weapons of the Weak” (1985), when resistance remains within the limits of dominant power, it cannot fundamentally challenge the power.

Prime examples are those who announce a hiatus from (and later, a return to) social media via Instagram post, or decry the toxicity of Twitter via tweet. It’s not that these people are unaware of the irony; rather, it’s genuinely difficult to resist when the most effective place to share or promote said resistance is the very thing we’re denouncing.

THE REAL QUESTION — — —
It’s at this moment we realise a more pressing question than ‘how do we resist?’ is ‘do we even want to?’

For most of us, none of the above is news. We’ve read the headlines, heard the warnings and know the algorithms are full of flaws with seedy consequences. But, quite frankly, it doesn’t seem that bad. Immediate convenience conquers prospective predicaments and in that moment we’re just relieved there are fewer options to evaluate and decisions to make each day.

But let’s say you reach a day when you don’t want computers to, as Iger says, decide your desires. Let’s say you seek out research papers and news articles, opinion pieces and (ironically enough) social media posts for advice on how to resist. What will you find? Intelligent insights into the issue, extrapolations of the consequences and countless reiterations of the infuriatingly vague instruction to ‘be aware’.

You’ll feel irritated, and rightly so. To discuss the democracy-toppling, identity-controlling consequences of a system, then send readers off with a feeble ‘be mindful’, feels at best lazy and at worst fatal.

The house is burning, the arsonists certainly won’t put it out, but hey, simply knowing the blaze is there should make you feel better.

However, this advice makes sense when you realise it’s the more palatable of two options: moderate or abstain.

SO, HOW DO WE ACTUALLY RESIST? — — —
Since it’s unlikely anyone reading is ready to surrender algorithm-using platforms and devices altogether, and these platforms are unlikely to moderate themselves anytime soon, our remaining option is to write our own checks and balances, curb our own use.

One cannot moderate what one isn’t cognisant of, so yes, awareness must be the start. Watch films like The Social Dilemma, read critical articles and learn about the Silicon Valley founders whose morals finally caught up with them and chased them from the industry. Offer your habits and behaviours up for criticism from friends and allow yourself to feel convicted.

But, knowledge without works isn’t worth much. So, then, with the knowledge that algorithms aren’t altruistic and ‘consumability’ doesn’t denote quality, we must, quite bluntly, inconvenience ourselves.

Turn off Netflix and YouTube’s autoplay and commit to actively searching for content, instead of allowing it to be passively served to you. Try a period of only watching what is recommended by humans (friends, family or critics), or reinstate the archaic yet delightful practice of film clubs, where you take turns selecting a movie. Clear your history, restart your streaming platforms and don’t tell them what you like.

Watch actively, think critically and appreciate the feeling of disliking something a little off the beaten path of common culture; something a little different or difficult, foreign or peculiar. Because it means you’re not just passively watching, but actively seeing.
