Why are we susceptible to junk news?

How junk news preys on and attacks us

Inauthentic actors
3 min read · Jan 8, 2023

Social media is an instrumental part of many of our lives, and checking in has become routine. The reasons for using it are extensive: keeping up with pop culture, following the news, contacting friends and family and creating our own content. The grasp it has over our lives makes it a valuable tool for junk news infiltration.


Algorithms are fundamental to how junk news spreads disinformation. Through them, ‘the purveyors of disinformation have learned to exploit social media platforms to engineer content discovery’ (Bradshaw, 2019, p.2). However, our exposure to this content is not solely the fault of the apps we use, but also of our own content preferences.
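
To see why engagement-driven ranking favours junk, consider a toy sketch in Python. The post titles and engagement scores are invented for illustration; this is a simplified model of the incentive Bradshaw describes, not any platform’s actual ranking system. A feed that orders posts purely by predicted engagement surfaces the most sensational content first, regardless of accuracy:

```python
# Hypothetical toy model of engagement-only ranking.
# All names and numbers are invented for illustration.
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    predicted_engagement: float  # clicks/shares a model expects

posts = [
    Post("City council publishes budget report", 0.02),
    Post("SHOCKING: what THEY don't want you to know!", 0.31),
    Post("Peer-reviewed study on local air quality", 0.04),
]

# Ranking by engagement alone pushes sensational junk to the top;
# accuracy never enters the calculation.
feed = sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)
for post in feed:
    print(post.title)
```

Real ranking systems weigh many more signals, but the incentive the sketch captures, engagement first and accuracy nowhere, is the opening that purveyors of disinformation exploit.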

I will be exploring how social media and our personalised algorithms go hand in hand in making junk news popular and in leaving users susceptible to becoming dominant readers of it.

Two theories support the idea that users circulate junk news:

  • Philip Howard’s selective exposure
  • Eli Pariser’s filter bubble

Selective exposure argues that ‘we prefer to strengthen our ties to the people, information sources, and organizations we already know and like’ (Howard, 2020, p.100). In other words, we are more likely to come across information and stories that are already aligned with our interests, thoughts and beliefs. There is debate over how much of the algorithm we control versus how much is controlled by social media firms, but we cannot deny ‘the important role that individuals play in exercising their information preferences on the internet’ (Howard, 2020, p.100).

Our social media feeds are the result of a filtration system that diminishes the variety of content we see. In relation to news, this means we ‘do not get a representative, balanced, or accurate selection of news and information during an election’ (Howard, 2020, p.101). This is similar to confirmation bias, whereby ‘pre-existing beliefs and attitudes are reinforced’ by what we see (Modgil et al., 2021). The sketch below shows how that narrowing can compound over time.
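
As a hypothetical illustration of that feedback loop, here is a small Python simulation in which the feed learns from clicks and clicks only go to content we already agree with. The viewpoint labels, weights and click rule are all invented for this sketch; real recommender systems are far more complex:

```python
# Toy simulation of a filter bubble forming. Purely illustrative:
# the feed shows more of whatever gets clicked, and we click what
# we already agree with, so variety shrinks round by round.
import random

random.seed(42)
viewpoints = ["left", "centre", "right"]
weights = {v: 1.0 for v in viewpoints}   # the feed starts balanced
user_preference = "right"                # we click what we agree with

for round_number in range(1, 6):
    shown = random.choices(
        viewpoints, weights=[weights[v] for v in viewpoints], k=10
    )
    clicks = sum(1 for v in shown if v == user_preference)
    # each click tells the algorithm to show more of the same
    weights[user_preference] += clicks * 0.5
    share = weights[user_preference] / sum(weights.values())
    print(f"round {round_number}: {share:.0%} of the feed matches our views")
```

Each round, the share of the feed matching our existing views grows. Nothing blocks other viewpoints outright; the ranking simply stops showing them, which is the bubble Pariser describes.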


Eli Pariser is an author, entrepreneur and activist who introduced the theory of filter bubbles. According to Pariser (2011), a filter bubble is:

a phenomenon in which a person is exposed to ideas, people, facts, or news that adhere to or are consistent with a particular political or social ideology.


A ‘filter bubble is a personal unique universe of information that we live in’ (Pariser, 2011). The theory affirms Howard’s arguments regarding selective exposure. We are inside a bubble, where we are bombarded with messages and information that confirm our own beliefs and thoughts.

This causes problems. Being stuck within our current knowledge and having it constantly affirmed makes it harder to be receptive to other information. It also makes it hard to leave the cycle of junk news once you have been targeted by it. The filter bubble explains why we rarely see a jump from one political stance to the other, and the groups this affects most are hard-right conservatives/Republicans.

Trusting only friends and family as sources of news and information, and never venturing beyond social media to be informed, entraps users, making us vulnerable to junk news and allowing the firms and governments pushing the misinformation to win.


Bibliography

Bradshaw, S. (2019). Disinformation optimised: Gaming search engine algorithms to amplify junk news. Internet Policy Review, 8(4), 1–24. https://doi.org/10.14763/2019.4.1442

Howard, P. N. (2020). Lie machines: How to save democracy from troll armies, deceitful robots, junk news operations, and political operatives. Yale University Press. https://doi.org/10.2307/j.ctv10sm8wg

Lum, M. (2017, January 27). The surprising difference between “filter bubble” and “echo chamber”. Medium. https://medium.com/@nicklum/the-surprising-difference-between-filter-bubble-and-echo-chamber-b909ef2542cc

Modgil, S., Singh, R. K., Gupta, S., & Dennehy, D. (2021). A confirmation bias view on social media induced polarisation during COVID-19. Information Systems Frontiers. https://pubmed.ncbi.nlm.nih.gov/34840520/

Pariser, E. (2011). Beware online “filter bubbles” [Video]. TED Conferences. https://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles?language=en
