How Sites Like Google and Facebook Put You in a Political Echo Chamber

News feeds like Facebook's shape the way you see the world. The algorithms behind these sites decide which news is important enough for you to see. They're meant to make the stories you read more relevant, but they have an unpleasant side effect: by controlling the flow of information, they land you in an echo chamber where you only hear ideas you already agree with.

On the face of it, the idea of offering stories that are relevant to you makes sense. Google Now surfaces articles it thinks you'll love based on your search history, and you can even stop it from showing you stories about topics or news sources you don't like. After all, why would you want Google to recommend junk from sites you hate? The trouble is that the more you hear the same points of view from the same sources, the more they reinforce your ideas without ever challenging them. That's harmless if you just don't want to see articles about underwater basket weaving, but when it comes to important topics, getting news from only one angle colors your perception and can even lead to a tribal mentality, where we vilify anyone outside our group.

Algorithmic feeds encourage bias, even when they don't mean to

Facebook is the largest news distributor in the world; it's the main way many people hear about and read the news. That also gives the social network enormous influence over your perception. Earlier this year, we showed you how to find out what Facebook thinks your political orientation is. And as the Wall Street Journal's interactive tool demonstrated, the version of a story you hear depends heavily on which way you lean. If you read liberal news sources (or even just have predominantly liberal friends), Facebook will show you more liberal news. The same goes for conservatives, and even for people at the fringes of the political spectrum. In short, the feed feeds your confirmation bias: the more you read stories you agree with, the more Facebook shows you stories you agree with.
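If you're curious what that feedback loop looks like mechanically, here's a toy sketch in Python. To be clear, it's purely illustrative and has nothing to do with Facebook's actual code: just a hypothetical ranker that scores stories by how often you've clicked on their topic before.

```python
from collections import Counter

def rank_feed(stories, click_history):
    """Rank stories so the ones matching your past clicks come first.

    stories:       list of (headline, topic) tuples
    click_history: topics of stories you clicked on before
    """
    topic_weights = Counter(click_history)
    # Stories on topics you already engage with score highest;
    # unfamiliar topics score zero and sink to the bottom.
    return sorted(stories, key=lambda s: topic_weights[s[1]], reverse=True)

history = ["liberal-politics", "liberal-politics", "sports"]
feed = [
    ("Tax plan analysis", "conservative-politics"),
    ("Rally recap", "liberal-politics"),
    ("Game highlights", "sports"),
]
print(rank_feed(feed, history))
# The liberal-politics story ranks first; the conservative one ranks last.
```

Every click adds weight to a topic you already favor, so the next ranking skews a little further in the same direction.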

Of course, the system works this way partly because we don't like the idea of big companies deciding for us what we see. Earlier this year, Gizmodo reported that Facebook's curators regularly suppressed conservative news in its trending module (which is separate from the news feed). That prompted a US Senate inquiry into the social media giant's possible bias, after which Facebook removed all human curators from the trending module. It even stopped writing summaries for trending stories; popular topics are now identified only by their subject. If you want to know why John Oliver is trending, for example, you have to click through to a news source to find out. You can't get the answer from Facebook itself.

Facebook isn't alone, either; Google does this too. If you swipe over to Google Now on your smartphone, you'll see recommended articles to read. They're chosen based on the topics you search for, the articles you choose to read, and the news sources you visit most often. In other words, Google also wants to give you more of what you're already looking at.

While this is convenient, it definitely reinforces bias, especially your own. Worse, Google doesn't seem to care much about the quality of the articles it shows you, as long as they're relevant. For example, Google knows I'm looking forward to the new season of Rick and Morty, so it regularly surfaces any article that might contain information about a potential season three release date. Most of those articles are rehashes of crappy rumors and hoaxes that were debunked months ago, but that doesn't matter. They say "Rick and Morty," so I get the recommendation. The same thing happens when I read political news. If I click on one story about a certain politician, Google recommends the most popular stories about that person, whether or not those stories are well researched or even accurate. As long as I've "shown interest," Google figures I should read them.

Unfortunately, showing you only the articles an algorithm thinks you want to see is like feeding you only the food you already think tastes best. If the computer decides you responded positively to ice cream, you might never be fed anything else again, and that won't make you healthy. Likewise, Facebook and Google have an interest in showing you the stories you engage with most. At best, these companies may try to avoid spreading outright fake news, but they can't help feeding your subtler biases when the whole system rewards you with more of what you already like.
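You can watch that reward loop compound with a back-of-the-envelope simulation. Everything in this Python sketch is made up for illustration (the topics, the reader's tastes, the update rule), but it shows the basic dynamic: a feed that starts evenly mixed, served to a reader who only mildly favors one topic.

```python
import random

topics = ["left", "right", "sports", "science"]
# The reader mildly prefers one topic...
preference = {"left": 0.4, "right": 0.2, "sports": 0.2, "science": 0.2}
# ...but the feed starts out evenly mixed.
feed_share = {t: 0.25 for t in topics}

for _ in range(20):
    # The feed serves stories in proportion to each topic's current share.
    shown = random.choices(topics, weights=[feed_share[t] for t in topics], k=100)
    # The reader clicks each story with probability equal to their preference.
    clicks = [t for t in shown if random.random() < preference[t]]
    total = len(clicks) or 1
    # The algorithm shifts the feed toward whatever got clicked.
    for t in topics:
        feed_share[t] = 0.5 * feed_share[t] + 0.5 * clicks.count(t) / total

print(feed_share)
# After 20 rounds, the mildly preferred topic dominates the feed.
```

A 40 percent preference is enough: each round the feed serves a bit more of whatever got clicked, so the next round's clicks skew further, until the other topics all but vanish.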

What you can do to combat bias and inform yourself

Customizing your news feed to suit your interests isn't necessarily a bad thing. There's no reason to read junk sites that pander to conspiracy theories, shamelessly lie, or lob lowbrow insults for a cheap laugh (unless you're into that sort of thing). Still, there are a few things you can do to avoid building your own echo chamber:

  • Use news apps that don't pick content for you. Algorithmic feeds are fine if you just want the gist of what's happening today without digging too deep. For a broader perspective, follow sites with an RSS reader like Feedly, or use Twitter's chronological timeline. You can even keep using Facebook by sorting your feed by Most Recent. Google News also gathers related stories on the same topic in one place, so you can see every angle instead of just the one that best fits your views.
  • Follow news from multiple perspectives. The best way to understand your world better is to read stories from different points of view. Even when one news outlet tries its best to be unbiased, others may catch what it misses or offer an angle no one else has considered. As you fill your feeds with sources you want to read, try expanding your list to include outlets that challenge you or offer ideas you're not familiar with. Challenging your own ideas is a powerful way to refine them.
  • Pause before sharing a story to consider your own biases. When you come across a story about something crazy, stop and ask yourself, "Is this true?" Don't share it just because it sounds believable. Take a few minutes to do a little research and vet the story before you believe it. Sussing out the truth online is a lifelong skill, and no one gets it right every time, but a few minutes of pause before you share or act can go a long way toward curbing the spread of misinformation online.

When a tech company decides to use algorithms to tailor the world to your point of view, it distorts your reality. That doesn't mean all algorithms are bad, but we need to be honest with ourselves about how our perception of world events is colored by the news we see and read. If you want to avoid falling prey to the almighty algorithm, take your research into your own hands. Or, at the very least, accept that you're only getting the slice of the story Google or Facebook thinks you'll enjoy most.
