Posted on Wed 13 September 2017

With opinions coming from all sides on social media, how do we make sure that we aren't succumbing to human nature and seeking sources that we agree with?

Table of Contents

  1. Human Nature
  2. Recognizing An Echo Chamber
  3. Facebook
  4. Reddit
  5. Twitter
  6. How to Escape

Human Nature

To preface this, I will say that I don't have a psychology degree, a social science degree, or any formal education of relevance. I do, however, like many youth of my generation, have extensive experience with the current suite of social media platforms such as Facebook, Twitter, Reddit, Tumblr, Snapchat, etc.

Having just read a few articles (1, 2, 3) on confirmation bias and the like, it occurred to me that even if people are able to recognize an echo chamber, they may not know how to escape it, or why they should want to.

Some of the relevant cognitive biases which apply here include:

  1. Confirmation Bias
  2. Belief Bias
  3. In-Group Bias
  4. Framing Effect
  5. Anchoring
  6. Dunning-Kruger Effect

I won't go into detail about each of these (mainly because I don't understand them well enough to give a proper explanation). Remember that this list is not exhaustive; many other factors shape digital consumption.

Recognizing An Echo Chamber

Sometimes it can be difficult to define when a medium transforms from a balanced culture into an echo chamber, particularly if it becomes biased in a way that aligns with your own opinions. This is because everyone enjoys having confirmation of their beliefs. It is easier to accept a falsehood than to admit a personal misconception. That's why it is important to strive for self-improvement, especially when it is difficult.


Facebook

Facebook is an interesting case, since your experience is fundamentally based on your immediate social bubble. Facebook started out only at Harvard but quickly encompassed most colleges and universities. The way a friends list was built up was entirely based on where you went to school, which is a very particular bubble. Unfortunately, it is difficult to expand your friends list with contrarian views on purpose. There is no doubt that Facebook employs algorithms to maximize showing you content that you will click on. Whether this comes in the form of pictures, articles, or quizzes, it is important to try your best to click on links from friends that you do not necessarily see eye to eye with. Unfortunately, this does require that those posts come from valid sources.


NOTE: Reading only the headlines is a problem that permeates the site, influencing which posts are most viewed; such posts are often described as clickbait.

Reddit

This is the platform that I am most familiar with, and it has some of the most entrenched bubbles. With an archaic voting system, powerful self-appointed moderators, and a formerly heavily tech-influenced audience, it is no wonder that many subreddits quickly turn into echo chambers. While some subreddits claim to be free from bias, the voting system, though not intended to hide dissenting opinions, often works in favour of what is commonly labelled 'the hive mind'. If you've spent any time on Reddit, then you know that the user base is quite disparate at times, ranging from the average (/r/hockey) to the ugly (/r/incels). Each of these subreddits has quite strict rules which are enforced by moderators. The problem here is that where one is a general-purpose gathering of people with a similar interest, the other is an echo chamber where dissenting opinions are silenced. With the subscription system, it is easy to log in to an account on Reddit and fill up your front page with subreddits full of ideas that you agree with.


Twitter

An interesting study showed the lack of crossover between political ideologies in the Twitterverse. Not only were people of differing opinions not interacting, they were not even aware of each other for the most part.

How to Escape

Large corporations have gained an incredible ability to influence your web experience, particularly through social media. Almost every site you visit will collect whatever data it can, whether it is info on the links you click or the site that directed you there. With a history on its users, a site can tailor information to better influence your behaviour.
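None of these companies publish their ranking algorithms, but the feedback loop described above can be sketched as a toy model: a ranker that scores posts purely by a user's past clicks will, after even one early click, keep surfacing the same viewpoint. Everything below (the post fields, the scoring rule, the click behaviour) is invented purely for illustration:

```python
import random

def rank_feed(posts, click_history):
    """Toy engagement ranker: score each post by how often the user has
    previously clicked posts with the same viewpoint, highest first."""
    return sorted(posts, key=lambda p: click_history.get(p["viewpoint"], 0),
                  reverse=True)

def simulate(rounds=50, seed=0):
    """Simulate a user who always clicks the top-ranked post.
    A single early 'agree' click gets reinforced every round."""
    rng = random.Random(seed)
    click_history = {"agree": 1, "disagree": 0}  # one early click on "agree"
    shown = []
    for _ in range(rounds):
        posts = [{"viewpoint": "agree"}, {"viewpoint": "disagree"}]
        rng.shuffle(posts)                    # order of arrival doesn't matter
        top = rank_feed(posts, click_history)[0]
        click_history[top["viewpoint"]] += 1  # the click feeds the ranker
        shown.append(top["viewpoint"])
    return shown

print(simulate())  # every item shown is "agree"
```

The point of the sketch is that no malice is required: optimizing for clicks alone is enough to collapse a feed onto one viewpoint, which is why deliberately clicking on content you disagree with can matter.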

The best medicine for an echo chamber is knowledge. Understanding that you are only receiving half of any given story is an important first step. The second step, actively choosing to listen to ideas that conflict with your worldview, is much more difficult. In some ways it seems similar to the well-known stages of grief:

  1. Denial - Disregarding the possibility that your worldview might have inconsistencies or be invalid in any way.
  2. Anger - Lashing out at anyone who proposes ideas that differ from yours.
  3. Bargaining - Commonly found as whataboutism.

Obviously these are just similarities, and they don't necessarily reflect any real process for dealing with living in an echo chamber.