The filter bubble is interesting because we all assume that everyone else is in one, except ourselves.

The basic concept is straightforward: a filter bubble is the state in which we only (or mainly) get news or information that conforms with our existing beliefs and ideas.

But I don’t want you to just pick up the concept so you can use it against your communist friend in your next debate; I want you to actually understand it.

So let’s look at how the filter bubble exploits basic human nature, how it’s amplified, some criticism the theory has received, and how to avoid it.


How it works

Confirmation bias. You’ve probably heard of it. Basically, we humans love to be right. There’s nothing inherently wrong with that; you might even argue it can be helpful in many scenarios. But it also makes us seek supporting information (information that supports our beliefs and ideas is simply more appealing and interesting to us) rather than confronting information.

The same principle applies to the people we interact with: we just like like-minded individuals more. It’s easier to interact with them, there’s less ground for confrontation and mistrust, and, again, they support our beliefs and ideas.

Enter the internet. The internet has done many things; one of them is democratizing information, giving us a vast number of places to get news and facts from. There are obvious positive and negative sides to that, but that’s not the discussion here.

For what concerns us, this has forced us to choose which sources of information to follow or use, because it’s impossible to keep up with all the news from all the sources. Every person has their own ways of deciding on these sources: reputation, ease of access, price (if any), accuracy of facts, format, entertainment value, and so on. But a big part will also be how much a source confirms your current beliefs.

I don’t think we’re fully aware of how we make these decisions; I believe we just build our information network over time without thinking too much about all its elements. Yes, sometimes we might decide on a certain source on a rational basis, but in the end it’s mostly sources that generally agree with us.

Another element that has come into play is the algorithms social networks use to provide us with content (algorithms on the internet are another topic we’ll leave for another day, as there is a lot to unpack there too). In interviews with the creators of these algorithms, it becomes evident that there was no ill intention: they just wanted to build a system that would feed us the type of information we like – or the type the algorithm thinks we like.

But, as we know, the type of information we like is mostly the type that confirms our belief systems. So the algorithms just provide us with more and more of the same – slowly but surely feeding us slightly more extreme versions of it.

So, current technology has made us even more susceptible to confirmation bias. Each person now lives in their own bubble of information.
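The feedback loop described above can be sketched in a few lines of code. This is a deliberately simplified toy model with made-up engagement numbers, not a description of any real recommender system: a feed starts out balanced, users click more often on confirming content, and the feed adapts toward whatever gets clicks.

```python
import random

def simulate_feed(steps=1000, learn_rate=0.01, seed=42):
    """Toy model of a recommender feedback loop.

    'bias' is the share of belief-confirming content the feed shows.
    The (hypothetical) user engages more with confirming items, and the
    feed nudges itself toward whatever earned engagement. All
    probabilities are illustrative assumptions, not empirical data.
    """
    rng = random.Random(seed)
    bias = 0.5  # feed starts balanced: 50% confirming, 50% challenging
    for _ in range(steps):
        confirming = rng.random() < bias  # what the feed shows this round
        # confirmation bias: confirming content is engaged with more often
        engaged = rng.random() < (0.8 if confirming else 0.3)
        if engaged:  # feed drifts toward whatever got a click
            bias += learn_rate * (1 if confirming else -1)
        bias = min(max(bias, 0.0), 1.0)
    return bias

print(simulate_feed())
```

Even though the feed starts perfectly balanced and the user never explicitly asks for a bubble, the small asymmetry in engagement is enough to push the feed toward showing almost exclusively confirming content.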


Overlooked danger

There are some obvious dangers here for democracy, and for liberal values in general. Evidently, each person living in their own filter bubble cannot be good for a society in which we’re supposed to interact with each other. It promotes and creates antagonistic ideological groups, hinders open dialogue, and is often responsible for the popularization of conspiracy theories or concepts like ‘safe spaces’.

Actually, safe spaces seem like one of the more deliberate attempts by a group of people to create a filter bubble.

But an often overlooked danger, because it affects the individual rather than society, is losing the ability to think for yourself. That alone should be motivation enough to reconsider your own information network.

At the start you might choose sources of information that suit what you already believe, but as new information enters your awareness, it gets more and more distorted by the filter you created for yourself.

It’s groupthink: you start to identify with a group, and before you know it you give the group the power to think for you.

You chose them because they think like you, so you must, by extension, think like them.

One good, and less politically charged, example is football: actions are obvious fouls when committed against your team, and obvious fair play when a player on your team commits them. Your team being in debt was a mistake by previous owners and not such a big deal, but unacceptable if the rival team is in the same situation. And hey, referees are just human if they make a mistake that benefits your team, but it’s an obvious conspiracy by the league if the mistake benefits the other team.

Yes, you laugh at football fans, but someone impartial about politics and social issues would probably laugh at you the same way if they saw your behavior, ideas, and beliefs on those topics.


Critics

The filter bubble has received some criticism as well. There’s one article, by Peter Dahlgren1, that I found very well researched and that makes some strong points.

Most of the problems he has with the theory can be traced back to the fact that the term was popularized to give a clear account of the issues that modern democracies are facing.

He’s right: filter bubbles don’t explain everything, and some points made by Eli Pariser (the person who popularized the concept) are just wrong. Here are, in summary, the issues he sees with the theory:

1. Filter bubbles can be seen at two levels: technological and societal

2. People often seek supporting information, but seldom avoid challenging information

3. A digital choice does not necessarily reveal an individual’s true preference

4. People prefer like-minded individuals, but interact with many others too

5. Politics is only a small part of people’s lives

6. Different media can fulfill different needs

7. The United States is an outlier in the world

8. Democracy does not require regular input from everyone

9. It is not clear what a filter bubble is

His main point is that even though you might be able to defend the filter bubble from a technological point of view, on a societal level it’s more difficult. It’s even more difficult to prove that it actually causes the issues Pariser says it causes.

Personally, I think Dahlgren doesn’t get some theoretical points quite right.

For example, it is true that filter bubbles don’t prevent us from seeing controversial information that goes against our belief systems; people are confronted with it, and even look for such information deliberately. The problem I see is that the way we consume this information is very different: when you consume information contrary to your belief system, you do so critically, looking for the flaws, with a filter of ‘that is not true’. When you review information that confirms what you already believe, on the contrary, you’re far more likely to accept it as correct. People think less critically.

The same goes for interacting with people we know think similarly to us, versus those who don’t.

But that’s not really that relevant.

I believe the main issue is that it’s nearly impossible to prove. The filter bubble is one piece of a much larger mechanism, and I haven’t seen people denying this. The debate revolves around how big that piece is, and how important.

Pariser is no doubt a smart person who made some great observations, and he has done a very good job of raising awareness of an issue that is there and needs to be spoken about. But to give it visibility, Pariser also had to overplay its causes and implications.

A good example of the complexity of the issue is the point about polarization. It seems logical that the current algorithms lead to political polarization, especially if you have looked at people’s online and offline behavior over the last years. But how can you accurately prove it? How can you accurately disprove it? How can you measure the actual correlation?

I see it the same as the debate about behavioral gender differences: how much is biological, and how much is social nurture? We don’t know.

The filter bubble doesn’t explain everything, and there are flaws to be addressed. Extraordinary claims require extraordinary evidence – and some claims made using the filter bubble argument are extraordinary.

But that doesn’t change the fact that confirmation bias and new technology have found a way to exploit each other, and we are all victims. Call it filter bubble, selective exposure, or “you’re so brainwashed…”. But it’s there, and we should at least act on it on a personal level.

Let’s look at some ways to do that.


How to avoid it

Let’s face it: you won’t be able to avoid it completely. But you can get far with some easy steps:

  • Browse for news in incognito mode, delete cookies, and avoid logging into your accounts when visiting news sources.
  • Switch your focus from entertainment to education. Be honest with yourself: when you’re browsing for news, you’re mainly looking for something to do. Only look for information you’re genuinely curious about and want to learn; most news isn’t newsworthy anyway.
  • Consciously choose your news sources; they’re probably not the ones you’ve been using by default. Think about which topics interest you, then look for rigorous sources that focus on those topics. Don’t let the media decide which news you focus on.
  • Delete Twitter (calling it X will be challenging). Delete Instagram and Facebook. Or at least use them for entertainment only: social media create the biggest filter bubbles I’ve seen so far, and pose the biggest threat. Use them for nothing serious: look at cat pictures, chess videos, or congratulate your forgotten friends on their birthdays.
  • Watch out for YouTube: it’s one thing to occasionally search for an explanatory video, but you can quickly get drawn into a big hole of shit.

So?

So, the concept of the filter bubble is not watertight, but it’s worth knowing and looking out for: you’re probably in one, I’m probably in one, and I’m sure you see other people being in one.

The internet is a great place, with many positive things, but not without its issues. You have to use it consciously and be aware of the many dangers. The filter bubble is one of them.

  1. Dahlgren, Peter (2021). A critical review of filter bubbles and a comparison with selective exposure. Nordicom Review, 42, 15–33. doi:10.2478/nor-2021-0002. ↩︎
