2020: the age of information, disinformation and sometimes no information. We are constantly battered with breaking news from every device we own, even our watches. Sometimes you read it, sometimes you don’t – it all depends on how important that news is to you. Which leads me to our problem: echo chambers.
As you click, swipe and scroll away from certain things on the web, you are being tracked. Not necessarily for bad reasons, usually for simple advertising, but nonetheless you are contributing to a ‘map’ of yourself that contains a history of your tendencies to click, swipe or scroll away from certain things. Tech giants use this information to show you, more often, the things they think that you will most likely click, to keep you on their platform as long as possible.
The logic is simple: they know what you like, so they show you more of it. The problem, however, is not so simple.
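To make that concrete, here is a deliberately simplified sketch of engagement-based ranking – not any platform’s actual algorithm, just the general idea: score each candidate post by how often you’ve clicked its topic before, and sort the feed accordingly. The `rank_feed` function and the post structure are hypothetical.

```python
from collections import Counter

def rank_feed(click_history, candidate_posts):
    """Rank candidate posts by how often the user has clicked
    posts on the same topic before -- a crude proxy for
    engagement-based ranking."""
    topic_clicks = Counter(click_history)  # topic -> past click count
    return sorted(candidate_posts,
                  key=lambda post: topic_clicks[post["topic"]],
                  reverse=True)

# A user who mostly clicks posts about one topic...
history = ["cats", "cats", "cats", "politics"]
feed = rank_feed(history, [
    {"id": 1, "topic": "politics"},
    {"id": 2, "topic": "cats"},
    {"id": 3, "topic": "gardening"},
])
# ...sees that topic pushed to the top of their feed.
print([post["topic"] for post in feed])  # ['cats', 'politics', 'gardening']
```

Real ranking systems use far richer signals than this, but the feedback loop is the same: what you clicked yesterday decides what you are shown today.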
When someone is shown the same information over and over again, they will likely start to believe it, especially if it comes from multiple sources. This problem manifests itself impressively well on the internet, where any website or profile can easily appear to have high authority and reputation.
The best example I can give of this is Facebook Groups and Pages. Facebook knows exactly what type of person you are by the pages you ‘like’, the groups you join and the posts you interact with. Scrolling through your feed you may see posts from people around the world who are also in these groups. You might add one of them as a friend because you had a great conversation and wholeheartedly agreed with each other, and now you have an extra person in your Friends list who agrees with you. You might enjoy a certain page’s posts, so you re-share them often, which increases how often that page appears in your feed. Whatever the case may be, how you act on Facebook directly affects what you see on Facebook.
This is how an ‘echo chamber’ works. An idea is presented to a closed circuit of people who already agree with it – thus creating a stronger closed circuit. When an external idea is presented, it is deflected before it is even allowed on the floor for debate – again strengthening the closed system. This can be dangerous if the closed circuit often shares incorrect information, or denies true information that doesn’t make its members happy. Check out this infographic for a helpful visual.
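You can watch that dynamic play out in a toy simulation. In the sketch below – a hypothetical model I made up for illustration, not a measured result – each member’s ‘belief’ is a number between 0 and 1; every round, members drift toward the group consensus, and an outside idea is deflected unless it is already close to that consensus.

```python
def echo_chamber_round(beliefs, outside_idea, tolerance=0.3):
    """One round of discussion: members pull toward the group
    average, and an outside idea only enters the debate if it is
    already close to that average -- otherwise it is deflected."""
    mean = sum(beliefs) / len(beliefs)
    if abs(outside_idea - mean) <= tolerance:
        mean = (mean + outside_idea) / 2   # the rare idea that gets in
    # Everyone drifts halfway toward the (possibly updated) consensus.
    return [b + 0.5 * (mean - b) for b in beliefs]

beliefs = [0.6, 0.7, 0.8, 0.9]   # the group already leans one way
for _ in range(5):
    beliefs = echo_chamber_round(beliefs, outside_idea=0.1)

spread = max(beliefs) - min(beliefs)   # internal disagreement
```

The outside idea (0.1) never gets a hearing, and the group’s internal disagreement halves every round – the ‘stronger closed circuit’ described above.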
How do you avoid this?
You know that feeling when you see something online and all of a sudden you are filled with rage? You whip open Google and start debunking it from every angle. Do the exact same thing when you see something that makes you happy. Emotions are blinding.
- Start vetting everything you interact with online as if you had to pay for every share, like or follow
- Check multiple trusted, legitimate sources when you are unsure about something
- Before you believe anything, see if it’s been peer-reviewed
- Don’t get your news from memes
Be careful what you do online. The groups you join, pages and posts you like, the people you follow – everything gets blended down together into one easily scrollable feed that constantly influences your thoughts and feelings about the world.
As Professor X puts it: “Don’t let it control you.”