Internet algorithms are the reason I regularly see writing advice, inspiring art, and incredibly cute and hilarious animals in my Instagram search tool. When you shop online, they’re the reason you might see ads or product suggestions that appeal to your interests or hobbies. They’re the reason why my YouTube feed filled with mukbang videos after my wife used my phone just once to watch a few… and now I’m haunted forever.
Using our digital histories — our likes and our follows — algorithms make it easy to get to what we want, when we want it.
There’s a danger inherent in that ease, though. I previously wrote an article for the fullest about the power of Internet algorithms to shrink our perceptions and understanding of the world, and how emerging technologies like augmented reality have the potential for disastrous reality-construction.
Now, rather than discuss how this might be theoretically dangerous, let’s explore a case study in how it is actually dangerous…
A Pew Research Center study spanning 25 years, released in 2017, shows a shift in ideological consistency in both our politics and the public at large: from a more centrist majority to two blocs that are more consistently liberal or conservative in their reported values and political beliefs.
People used to be more flexible in their values, able to cross political lines more easily in opinion and agenda. Now, that line is bold and wound with barbed wire.
As this partisan divide over political values has widened, antipathy toward the opposing party has more than doubled: in 1994, about 20% of each party expressed a very unfavorable opinion of the other; by 2017, 44% of Democrats and Democratic leaners and 45% of Republicans and Republican leaners did. A 2015 study by professors Jonathan Haidt and Sam Abrams likewise found that the degree of ideological purity within both parties had doubled over the previous 40 years, and that several converging trends have produced increased mutual animus across party lines.
Though Haidt and Abrams point to several causes that built up over time, what we’re focused on are the most recent changes in the media environment. With the advent of cable television and the Internet, the handful of news sources that existed before the 1980s exploded into hundreds of partisan outlets. Broadcasting, which appeals to a wide range of viewers, has been replaced by narrowcasting, which targets more specific audiences. Writer Tim Urban of the culture and science blog WaitButWhy explains that narrowcasting motivates news agencies to appeal to particular audiences, say Democratic or Republican, and to do so by bending their objectivity and accuracy.
In short, news agencies will leave out particular facts — or completely fabricate some — to appeal to the pre-existing beliefs and values of their target audience.
They do this while maintaining, through their marketing and branding, that they are sources of honest, factual information. Urban suggests that trust is society’s most precious resource, and that inaccurate, biased reporting manipulates that trust, spreading the perception of animus between groups.
So why would we fall for such a thing? All of us, to varying degrees, have a shared weakness: confirmation bias — a desire to seek out information that verifies our own viewpoints, beliefs, and understandings of the world. Confirmation is a warm, fuzzy blanket that reminds us that we know what’s going on, that we’re in control, that we’re right and someone else is wrong. It’s comfort, one that we have more access to than ever before because of handy Internet algorithms whose long arms gingerly tuck us in every night.
Enter the filter bubble, a contemporary phenomenon in which we become stuck in our own ideologies and perspectives due to these algorithms. If we, the audience, get our information through a filter that ensures we only get stuff we like, guess what our perception of reality is? Like those news sources, our respective realities become inaccurate, missing all the facts, made to please and comfort rather than inform, challenge, and educate.
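To make the mechanism concrete, here is a deliberately simplified sketch, written in Python. It is not how any real platform’s recommendation system works; the function, topic labels, and stories are all invented for illustration. It shows the core dynamic: if a feed scores content only against what you have already liked, the other side’s content quietly sinks out of view.

```python
# Toy illustration only -- not any real platform's algorithm.
# A "feed" that ranks stories purely by how well their topics
# match the topics of things you've liked before.
from collections import Counter

def rank_feed(stories, liked_topics):
    """Sort stories so the ones most similar to past likes come first."""
    likes = Counter(liked_topics)  # how often each topic was liked
    return sorted(
        stories,
        key=lambda story: -sum(likes[t] for t in story["topics"]),
    )

# A user whose history leans entirely one way (plus some cats):
history = ["team_blue", "team_blue", "team_blue", "cats"]

stories = [
    {"title": "Blue party rally recap", "topics": ["team_blue"]},
    {"title": "Red party policy explainer", "topics": ["team_red"]},
    {"title": "Cute cat compilation", "topics": ["cats"]},
]

feed = rank_feed(stories, history)
for story in feed:
    print(story["title"])
```

The opposing side’s story lands at the bottom of the feed not because it is less accurate or less newsworthy, but simply because nothing in the user’s history rewards showing it. Repeat that ranking every day and the bubble seals itself.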
Finally, we arrive at a situation in which it becomes easier to demonize the “other,” which more and more of our news coverage and political figures have been doing. Take a look at tweets from political representatives on both sides and you’ll see the word “disgusting” tossed around a lot in reference to the “other side.” A paper by psychologists Mark Schaller and Lesley Duncan suggests that disgust is a basic human emotion meant to protect us from disease as part of our behavioral immune system, but one we also apply to people. In this way, disgust has been linked to moral judgment, dehumanizing those it’s used against.
Nazis did this when they referred to Jews as rats, swine, and insects. Radio broadcasts before the Rwandan genocide referred to Tutsis as cockroaches. Psychologically, it’s far easier to hate and harm an enemy when they seem less than human.
Humanity has used disgust throughout history to rationalize atrocities, and we should be concerned about how our filter bubbles and politicians participate in spreading it. How many speeches, articles, or tweets have we encountered that talk about those dirty, dangerous, disgusting Democrats or Republicans? Is it any coincidence that the extremes on both sides have become more vocal and more visible? Have you ever heard yourself or a friend ask something like “But how can group x believe that or follow person y?”
Have you taken a moment to peruse your filter bubble to see if you encounter anything positive about the other side?
In this attempt at a more perfect nation, I’d like to think that filtration is just a trend we’ll eventually even out. Until then, there’s at least some comforting news: The Hidden Tribes of America, a yearlong 2018 study of polarization and tribalism by the nonprofit More in Common, found that the polarized split is more perception than reality. Fully 77% of the people studied fit into an “Exhausted Majority” who are fed up with polarization and believe our country can come together to solve its problems.
And perhaps we can. We just need to remember to pop those filter bubbles.