Explaining the phenomena: ‘filter bubbles’ and ‘fake news’

Although the internet can be a fountain of knowledge for learning about past and present news events, it can also have detrimental impacts: the freedom people have to create fake news means that false stories can infect your personalised algorithms (‘filter bubbles’). Pariser (2011), an internet activist, coined the term ‘filter bubble’ to explain why and how a user’s ability to view certain information is shaped by their previous internet usage. The more that you, as an individual user of digital media, engage with certain ideologies or pieces of potentially fake news, the more similar content the algorithm you have helped create will show you.

Filter bubbles appear in digital communication because of personalised algorithms which limit a user’s access to information based on past behaviour (Pariser 2011). Algorithms, “a finite list of precise steps for executing a particular task” (Jones and Hafner 2012), originate in mathematics, and computers are built on them. Search engine algorithms rank the websites you are shown using networked associations. Networked associations follow a bottom-up classification system: rather than imposing a fixed structure in advance, they group websites according to the close-knit relationships between their topics. These personalised algorithms allow the user to access specific pieces of information related to their previous searches. Another data classification system (that is, another way of organising information) is the hierarchical taxonomy. Hierarchical taxonomies represent a top-down classification system (Jones and Hafner 2012); a real-world example is the Dewey Decimal System used in libraries to organise books (Satija 2013), and the choices made when classifying in this way can reflect an ideology, an agenda, or even a bias. As Seargeant and Tagg (2016) argue in their blog post, if algorithms alone were to blame, filter bubbles should be easy to get rid of; the idea ‘that algorithms are responsible for filter bubbles’ is a one-sided explanation, because it ignores the fact that people make their own filter bubbles by pulling away from ‘political discussions and hiding opinions they disagree with’.
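A minimal sketch can make the mechanism concrete. The code below is hypothetical (no platform’s actual algorithm), but it shows how ranking by past behaviour narrows what a user sees: items whose topics match the user’s click history are pushed to the top of the feed.

```python
from collections import Counter

def rank_items(items, click_history):
    """Rank items by how often their topic appears in the user's
    click history -- a toy stand-in for a personalisation algorithm."""
    topic_counts = Counter(click_history)  # past behaviour, e.g. {'politics': 2}
    return sorted(items,
                  key=lambda item: topic_counts[item["topic"]],
                  reverse=True)

# A user who has mostly clicked on one topic sees that topic first.
feed = [
    {"title": "Election rumour", "topic": "politics"},
    {"title": "Transfer news", "topic": "sport"},
    {"title": "New poll", "topic": "politics"},
]
history = ["politics", "politics", "sport"]
for item in rank_items(feed, history):
    print(item["title"])
```

Because the ranking only ever rewards topics already in the history, each click reinforces the next ranking: the feedback loop that produces a filter bubble.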

The other phenomenon present in digital communication, fake news, is best described as ‘fabricated information’ designed to deliberately misinform people (Lazer et al. 2018). More often than not, the distribution of fake news benefits ‘specific social actors’ and could be labelled ‘propaganda’ (Yates 2016). Fake news is more prevalent in digital communication now because of reduced gatekeeping: the internet has become much more of a participatory sport, if you will, in which people can create their own content (user-generated content), for example to influence people to buy certain products (Mayrhofer et al. 2020). That same ability to influence can be manipulated to feed fake news into people’s timelines and misinform them. Another reason fake news is so widespread in digital communication is the spreadability and scalability of social media. Social media enables users to share content across platforms, or within the same platform, so one user who shares a misleading piece of ‘news’ spreads it further and affects more users. Fake news is a serious issue because it erodes people’s trust in all news and leads to disengagement. So, how can you check whether a piece of news you are viewing is legitimate? Follow these easy steps:

1. Check the source

2. Check the author

3. Check the facts

4. Follow up the references and potential hyperlinks

5. Check who else is citing the source
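The five steps above can be sketched as a simple checklist function. This is purely illustrative: the field names are hypothetical, and real fact-checking needs human judgement, not a script.

```python
def credibility_checklist(article):
    """Toy checklist mirroring the five steps: each satisfied
    check adds one point; a low score means 'be sceptical'."""
    checks = [
        bool(article.get("source")),          # 1. check the source
        bool(article.get("author")),          # 2. check the author
        bool(article.get("facts_verified")),  # 3. check the facts
        bool(article.get("references")),      # 4. follow up references/hyperlinks
        bool(article.get("cited_by")),        # 5. check who else cites the source
    ]
    return sum(checks), len(checks)

score, total = credibility_checklist({"source": "Known outlet",
                                      "author": "J. Smith"})
print(f"{score}/{total} checks passed")  # 2/5 checks passed
```

An article that names a source and an author but offers no verifiable facts, references, or citations scores only 2 out of 5, which is exactly the kind of piece the steps above tell you to treat with caution.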

There is never going to be a time when everyone holds the same beliefs and ideologies, so fake news and filter bubbles will remain a problem for many years to come. Filter bubbles can also be seen as social algorithms, created by the people you surround yourself with who share similar ideologies. But ‘friends’ on social media come from different parts of your life: family, work colleagues, old friends. These ‘friends’ can hold different views, thus creating intradiversity in what you see online (Seargeant and Tagg, in press). Research on fake news on Twitter has shown that fake news (specifically about politics) is retweeted by more users, making it spread exceptionally quickly (Vosoughi et al. 2018, cited in Zhou et al. 2020).

Overall, the two phenomena (filter bubbles and fake news) appear in digital communication for a multitude of reasons: algorithms, networked associations, and user-generated content. The technological and social algorithms involved in sustaining filter bubbles, and thus spreading fake news, make it nearly impossible to avoid misinformation altogether. All we can do as a society is accept this and try to fact-check before taking any form of news as gospel.

Word count: 791

References

Jones, R. & Hafner, C. 2012. Understanding Digital Literacies: A Practical Introduction. London and New York: Routledge.

Lazer, D.M., Baum, M.A., Benkler, Y., Berinsky, A.J., Greenhill, K.M., Menczer, F., Metzger, M.J., Nyhan, B., Pennycook, G., Rothschild, D. and Schudson, M., 2018. The science of fake news. Science 359(6380), pp.1094-1096.

Mayrhofer, M., Matthes, J., Einwiller, S. and Naderer, B., 2020. User generated content presenting brands on social media increases young adults’ purchase intention. International Journal of Advertising 39(1), pp.166-186.

Pariser, E. 2011. The Filter Bubble: What the Internet Is Hiding from You. New York: Penguin Press.

Satija, M.P., 2013. The theory and practice of the Dewey decimal classification system. Elsevier.

Seargeant, P. & Tagg, C. 2016. The filter bubble isn’t just Facebook’s fault – it’s yours. The Conversation 5 December. Available at: https://theconversation.com/the-filter-bubble-isnt-just-facebooks-fault-its-yours-69664 [Accessed: 15 March 2024].

Swart, J., 2021. Experiencing algorithms: How young people understand, feel about, and engage with algorithmic news selection on social media. Social Media + Society 7(2), p.20563051211008828.

Twitter (2006). Twitter. It’s what’s happening. [online] Twitter.com. Available at: https://twitter.com/?lang=en-gb. [Accessed: 15 March 2024]

Yates, S. 2016. ‘Fake news’ – why people believe it and what can be done to counter it. The Conversation 13 December. Available at: https://theconversation.com/fake-news-why-people-believe-it-and-what-can-be-done-to-counter-it-70013 [Accessed: 15 March 2024].
