The age of the personalised web is here! From handy tools that remember our passwords to annoying ads that follow us around the internet, the personalised web aims to serve you the most suitable and relevant information. But many of these services may be having an adverse effect on the way we interact with information.

Echo Chambers & Filter Bubbles


Imagine a discussion taking place in a pub about a new tax on alcoholic beverages. In a pub, where alcohol is served and its frequenters presumably like to drink, the prevailing view will be that it shouldn’t be taxed. The pub acts as an echo chamber: the majority of people in the establishment hold this opinion, and they’re unlikely to encounter opposing views in this space. Instead, their opinions will be ‘echoed’ back to them.

An echo chamber is an environment where a person only comes across beliefs and opinions that coincide with their own. Often they are spaces that bring people together based on shared traits and tendencies. Birds of a feather flock together, and the proverb extends to digital spaces and our digital selves. But online echo chambers are more nuanced because of the way we come across and interact with information. Online echo chambers are known as "filter bubbles", and they can be defined as a state of intellectual isolation caused by algorithmic personalisation.

In this hyper-connected world, where we’re just a click away from each other, we are being increasingly pulled into like-minded forums where there is little room for dissent, or a complete absence of opposing viewpoints. Just scroll down your Facebook or Twitter news feed and the effect of this "filter bubble" will be clear. You will see a news story from a publication you subscribed to because you like it. You’ll see photos and videos you enjoy because of who you follow and interact with. Everything will affirm your existing world views, and finding content that challenges your opinions will be difficult.

Filter bubbles occur through personalisation as people tailor who and what they want to see online. This could be through following specific groups, or through utilising the block feature and limiting your exposure to content you dislike. That might not sound so bad, but in doing this we assume that everyone thinks like us, and we forget that other perspectives exist.

The effects of filter bubbles

Filter bubbles specifically refer to the effects of personalisation on online platforms, namely the social media algorithms that determine the content we see. Echo chambers, on the other hand, refer to the social situations and groups we belong to in the real world. Like-minded relationships and social connections are not themselves a problem, but the amount of time we now spend on these platforms, the lack of exposure to different views, and the ease of propagating information mean that bad actors can use these spaces to disseminate misinformation, which is often believed to be true.

Prolonged use of personalisation tools can leave individuals with a myopic view of the world. In the absence of opposing views, finding information that challenges opinions becomes more difficult, which results in an insular understanding of the world.

Bad Actors, Filter Bubbles & the Attention Economy


Capitalising on filter bubbles has become lucrative and, for many of these platforms, it is an integral part of their business model. Social media platforms often rely on algorithms that feed us posts and ads they think we’ll like. In the real world, companies pay to advertise their products in the places their target audience frequents; this is no different online, but the opacity of the internet can be exploited by bad actors with malign intentions.


If we go back to the scenario of the pub, let’s say a made-up person called John spends 8 hours a day there, constantly drinking, exposed to the same adverts promoting drinking and the same group of people encouraging him to drink. He’s become accustomed to the space; he enjoys the company and is content within this echo chamber. The bartenders are happy because they are making money, the alcohol companies are happy because they’re promoting their drinks to their target audience and are likely to get repeat orders from the pub, and John is happy to indulge in his routine alcoholism, albeit to his detriment.

There are many parallels between this example of echo-chamber profiteering and capitalising on online filter bubbles. Online, the pub corresponds to the social media page or group: a place where like-minded people congregate. The alcohol is the biased content being shared in the group. The patrons are the members of the group who share and engage with the content. The bartenders are the bad actors feeding the group hyper-partisan content for their own gain (in this case, giving the alcoholic all the booze he desires). In the real world, we’d hope someone would try to help the person in the pub out of his alcoholism, but this is much less likely to happen online.

The more time you spend browsing in a shop, the more money you’re likely to spend. In recent years, online platforms have taken advantage of this, substituting the time you spend in an establishment for the time you spend in an online space: your attention. The attention economy is a term used to describe how businesses treat your attention as a resource. Much like the pub, online platforms usually have spaces to advertise products and services to you, meaning that your attention is now a commodity. In the pub, advertisers have to make a few assumptions about their audience before creating ads that get your attention and ultimately get you to buy their product or service. Online, however, businesses can benefit by targeting specific groups of people within a digital space or filter bubble. Thanks to the data these platforms collect on you, ads can be targeted with remarkable accuracy based on characteristics such as your age, sex, location, relationships, hobbies and interests. The specificity of this targeting has made it easier for bad actors to create and disseminate harmful content that triggers an emotional response from those within these filter bubbles, or from a specific audience within them.

Beyond ads targeted on personal characteristics, social media algorithms notice what you interact with and serve you ads according to those interactions. Not only do big corporations like Facebook collect this information to create profiles of ‘perfect consumers’ for desired products, they also monitor the length of time you spend interacting with specific brands, products and offers, so that they can retarget more specific ads to you.
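To make the mechanism concrete, here is a deliberately simplified sketch of interaction-driven feed ranking. It is not any platform's actual algorithm (those are proprietary and far more complex); the function and data here are hypothetical, but they illustrate how weighting content by past engagement narrows what surfaces next.

```python
# Toy illustration (NOT a real platform's algorithm): rank a feed by how
# often the user has engaged with each post's topic. The more you click
# on a topic, the more that topic dominates future feeds.
from collections import Counter

def rank_feed(posts, interaction_history):
    """posts: list of (post_id, topic) pairs.
    interaction_history: topics the user previously engaged with.
    Returns posts sorted with the most-engaged topics first."""
    topic_weight = Counter(interaction_history)
    # Counter returns 0 for topics never engaged with, so unfamiliar
    # content sinks to the bottom of the feed.
    return sorted(posts, key=lambda post: topic_weight[post[1]], reverse=True)

posts = [("p1", "cooking"), ("p2", "politics"), ("p3", "sport")]
history = ["sport", "sport", "politics"]  # past clicks and likes

feed = rank_feed(posts, history)
# "sport" posts now outrank everything else; each further click makes
# the next ranking even more homogeneous, i.e. a feedback loop.
```

The feedback loop is the key point: every interaction updates the weights, so the ranking drifts toward an ever-narrower slice of content unless diversity is deliberately injected.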

Today we’re online for an average of 24 hours a week, and we check our phones every twelve minutes. Notifications and alerts can draw you back into these like-minded spaces in an instant, wherever you are in the world, so that you are constantly exposed to belief-confirming information.

Alongside polarisation and exploitation, filter bubbles pose a threat to democracy and can potentially encourage extremism. Thomas Jefferson wrote that a well-informed electorate is a prerequisite to democracy, but those in filter bubbles don’t receive balanced information. Democracy relies on shared facts. Instead, we’re being offered parallel but separate universes within digital spaces dedicated to the promotion of a particular belief system. The biggest threat filter bubbles pose is that they can shift thought processes from moderate to extremist by only feeding people a certain type of ideology. Furthermore, bad actors distributing this content are less likely to fear repercussions due to the anonymity that the web provides.

Countering filter bubbles

Unlike many social media platforms, Logically was designed to use personalisation to counter filter bubbles and to ensure that people read contextualised information from a range of diverse sources and perspectives. The AI-powered news curation platform combines the best of machine learning with human curation to address some of the key challenges posed by the misinformation sphere. You can choose news topics and entities that interest you, and we will then show you credible sources of information from across the ideological spectrum to broaden your views, not limit them.

Exposure to diverse perspectives is vital for fostering critical thinking and making informed decisions. As tech continues to revolutionise the world we live in, there’s never been a more pivotal time for tools that combat echo chambers, filter bubbles, and the structural challenges of the digital information age.