Introduction
Algorithms lurk beneath the surface of every search and click an individual makes on websites like Google and Facebook, gathering the user’s data to display the information the company believes the user wants to see. These algorithms are portrayed as helpful, tailoring the user’s experience in a personalized manner so that they receive the information that is right for them (Google’s Search Algorithm and Ranking System – Google Search, n.d.). However, this data collection has a more nefarious side effect: the filter bubble. A filter bubble is an abstract concept describing the online lens an individual sees through when an algorithm takes their data and provides only information representative of that data (Techopedia, n.d.). Filter bubbles are nefarious because they sculpt a user’s point of view by continuously reinforcing the user’s original viewpoint, blocking them from gaining a deeper understanding, and doing so without the user realizing it is happening. On the other hand, if the user is getting the information they want, why does it matter whether they see all sides? While being presented only with the information the user wants may seem efficient, even ideal, the information the user cannot see keeps them from knowing all sides of the issue and restricts them from reaching a truly objective answer. To gain deeper insight, a user must be shown a breadth of answers. Getting around filter bubbles and accessing that breadth of answers starts with understanding the effects they have on the user: filter bubbles hinder a person’s ability to question deeply and obtain a well-rounded answer through their propensity to create echo chambers and polarizing viewpoints.
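To make the mechanism concrete, the short sketch below is a toy illustration in Python, with hypothetical topic labels and a deliberately simplified ranking rule rather than any real platform’s algorithm. It shows how ranking content by similarity to a user’s past clicks can, over repeated interactions, narrow the feed to whatever the user already engaged with.

```python
# Illustrative sketch only: a toy personalization loop showing how ranking
# content by similarity to past clicks can narrow what a user ever sees.
# Topic labels, feed sizes, and the user model are hypothetical, not any
# real platform's algorithm.
from collections import Counter
import random

ARTICLES = [("politics-left", i) for i in range(20)] + \
           [("politics-right", i) for i in range(20)] + \
           [("science", i) for i in range(20)]

def rank(history: Counter, candidates):
    # Score each article by how often the user already clicked its topic.
    return sorted(candidates, key=lambda a: history[a[0]], reverse=True)

def simulate(rounds: int = 30, feed_size: int = 5, seed: int = 0):
    random.seed(seed)
    history = Counter({"politics-left": 1})   # one initial click seeds the loop
    for _ in range(rounds):
        feed = rank(history, random.sample(ARTICLES, 15))[:feed_size]
        clicked = feed[0]                     # user clicks the top-ranked item
        history[clicked[0]] += 1
    return history

print(simulate())   # click history is typically dominated by the seeded topic
```

Even this crude rich-get-richer loop quickly stops surfacing topics the user never clicked, which is the filter bubble in miniature.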
Echo chambers
Due to filter bubbles, a person’s ideologies tend toward isolation (Berman and Katona, 2020). This isolation propels the person to seek information that reinforces their initial viewpoint, creating echo chambers. When an individual with an isolated view on a subject searches for answers, they gravitate toward information that confirms their existing bias. Research has shown that people prefer not to use opposing information to alter their views; instead, they use the information they find that confirms their original viewpoint to double down on what they already believe (Kappes et al., 2020). Even when a user specifically searches for an opposing point of view, whatever information filter bubbles bring forward, the user will tend to agree with the information that matches their isolated viewpoint. This pull back toward the initial point of view builds an echo chamber out of other users who hold similar beliefs, further confirming the user’s own.
The echoing of a user’s viewpoints is encouraged and amplified by social media groups whose views align with the user’s. Groups on platforms like Facebook are formed by like-minded people who join to have their views confirmed (Vicario et al., 2016a). When social media platforms allow the creation of such groups and couple them with algorithms that perpetuate confirmation bias, the user can readily find a space that projects and confirms their biased views, allowing them to exist in an echo chamber.
An individual who joins a social media group with aligning viewpoints ends up having their own viewpoints reverberated and strengthened within that group. Vicario et al. (2016b) find that when communities form homogeneous clusters of like-minded people (Facebook groups, for instance), these clusters become the primary driver of information isolation. Furthermore, Dandekar et al. (2013) devised a model showing that homophilous networks, paired with an individual’s biased assimilation of information, produce more extreme views aligned with the individual’s initial viewpoint. The reverberation of pre-existing, reinforced views within a like-minded group on social media therefore creates echo chambers that perpetuate information isolation.
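A rough sense of this dynamic can be given with a toy simulation. The sketch below is a deliberately simplified illustration, not Dandekar et al.’s actual model: agents in a homophilous cluster treat agreeing neighbours as full evidence and discount disagreeing ones, and a group that starts only mildly on one side drifts toward the extreme.

```python
# Simplified, illustrative sketch (not Dandekar et al.'s exact model) of how
# biased assimilation inside a homophilous cluster can push beliefs toward an
# extreme: agreeing neighbours count as full evidence, disagreeing neighbours
# are down-weighted, so a cluster that starts mildly on one side drifts outward.
import math
import random

def logit(p): return math.log(p / (1 - p))
def sigmoid(z): return 1 / (1 + math.exp(-z))

def step(beliefs, discount=0.3, rate=0.05):
    new = []
    for i, b in enumerate(beliefs):
        support = sum(1 for j, nb in enumerate(beliefs) if j != i and nb > 0.5)
        oppose = len(beliefs) - 1 - support
        # Biased assimilation: opposing voices are discounted relative to agreeing ones.
        shift = rate * (support - discount * oppose)
        new.append(sigmoid(logit(b) + shift))
    return new

random.seed(2)
beliefs = [0.55 + 0.1 * random.random() for _ in range(12)]   # mild initial agreement
for _ in range(40):
    beliefs = step(beliefs)
print(round(min(beliefs), 3), round(max(beliefs), 3))   # beliefs drift toward 1.0
```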
Polarizing viewpoints
A by-product of echo chambers is that they tend to deepen polarizing viewpoints. Filter bubbles hinder users from expanding their viewpoints beyond their own, and this lack of broader awareness generates polarized societal views (Min et al., 2019). Echo chambers emerge from confirmation bias, but when a website’s filter-bubble-inducing algorithm also contains social filtering mechanisms, those mechanisms have been shown to strengthen social polarization and separate echo chambers even further (Geschke et al., 2019). Filter bubbles thus exacerbate social polarization: algorithms with social filtering propel the user toward echo chambers that supply confirmation bias.
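The effect of social filtering can likewise be sketched with a small toy simulation. The snippet below is an illustration only, not Geschke et al.’s agent-based model: when each agent’s feed is restricted to the few most like-minded agents, a population that starts with a spread of attitudes typically splits into tight, mutually distant clusters rather than converging on a shared view.

```python
# Illustrative sketch only (not Geschke et al.'s actual agent-based model):
# each agent's feed is "socially filtered" to the k most similar agents, and
# attitudes shift a little toward what that feed contains. The population
# typically fragments into separated clusters instead of converging.
import random

def socially_filtered_feed(me: float, others, k: int = 3):
    # Social filtering: only the k most like-minded agents reach this agent.
    return sorted(others, key=lambda a: abs(a - me))[:k]

def simulate(n: int = 30, steps: int = 100, rate: float = 0.2, seed: int = 3):
    random.seed(seed)
    attitudes = [random.random() for _ in range(n)]        # attitudes in [0, 1]
    for _ in range(steps):
        updated = []
        for i, a in enumerate(attitudes):
            feed = socially_filtered_feed(a, attitudes[:i] + attitudes[i+1:])
            target = sum(feed) / len(feed)
            updated.append(a + rate * (target - a))        # drift toward the feed
        attitudes = updated
    return sorted(round(a, 2) for a in attitudes)

print(simulate())   # output typically groups into a few tight, distant clusters
```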
Of the many issues debated online, political topics are often inherently polarizing, and on the issue of Brexit, social media echo chambers increased social polarization. An analysis of more than one million Facebook users who sought out Brexit information in 2016 revealed that the content surfaced to them influenced polarization around the perception of Brexit (Vicario et al., 2017). The echo chambers Facebook created with its filter-bubble-inducing algorithms socially and politically polarized the matter of Brexit by filtering results and showing only those congruent with a user’s initial viewpoint.
Filter bubbles induced social and political polarization not only around Brexit but also during the 2016 U.S. presidential election. An analysis by Guo et al. (2020) of over 50 million tweets on Twitter suggested social and political polarization around the presidential candidates Hillary Clinton and Donald Trump. Like Facebook, Twitter played a part in socially polarizing the information presented to the user through its algorithms; more specifically, it surfaced socially polarized information pertaining to Hillary Clinton and Donald Trump during the 2016 U.S. presidential election. In combination, polarization and the creation of homogeneous communities, or echo chambers, were further exacerbated by filter bubbles during that election.
Conclusion
When an individual uses social media and search engine platforms to try to access true information, how that individual receives the information, and what part of the truth they receive, depends on the website’s algorithms and the data gathered about the user. This selective, filtered process creates the phenomenon known as a filter bubble, and while seemingly harmless and at times even efficient, the filter bubble has a more nefarious side: algorithm-induced filter bubbles perpetuate echo chambers and polarizing viewpoints. These by-products are significant because they can sculpt the mind of the user and propel them toward a singular, isolated ideological viewpoint without the user realizing it is happening. Homogeneous clusters called echo chambers allow a singular viewpoint on an issue to be chorused, strengthening that viewpoint through confirmation bias. These self-reinforcing echo chambers lead to further polarization of society’s points of view on all matters, but particularly to social polarization that is political in nature. The social polarization echo chambers cause on platforms such as Facebook and Twitter divides users and pushes them back toward their initial viewpoints, feeding them information filtered specifically for them through the algorithms’ social filtering mechanisms. One by-product of filter bubbles begets the other, and the sum of the whole stops an individual from being able to ask thoughtful questions and receive well-rounded, deep answers to them.
References
Berman, R., & Katona, Z. (2020). Curation Algorithms and Filter Bubbles in Social Networks. Marketing Science, 39(2), 296–316. https://doi.org/10.1287/mksc.2019.1208
Dandekar, P., Goel, A., & Lee, D. T. (2013). Biased assimilation, homophily, and the dynamics of polarization. Proceedings of the National Academy of Sciences, 110(15), 5791–5796. https://doi.org/10.1073/pnas.1217220110
Dwoskin, E., Stanley-Becker, I., & Kelly, H. (2020, November 4). Trump’s early victory declarations test tech giants’ mettle in policing threats to the election. Washington Post. https://www.washingtonpost.com/technology/2020/11/03/misinformation-election-social-text/
Geschke, D., Lorenz, J., & Holtz, P. (2019). The triple-filter bubble: Using agent-based modelling to test a meta-theoretical framework for the emergence of filter bubbles and echo chambers. British Journal of Social Psychology, 58(1), 129–149. https://doi.org/10.1111/bjso.12286
Google’s Search Algorithm and Ranking System—Google Search. (n.d.). Retrieved March 26, 2021, from https://www.google.com/search/howsearchworks/algorithms/
Guo, L., Rohde, J. A., & Wu, H. D. (2020). Who is responsible for Twitter’s echo chamber problem? Evidence from 2016 U.S. election networks. Information, Communication & Society, 23(2), 234–251. https://doi.org/10.1080/1369118X.2018.1499793
Kappes, A., Harvey, A. H., Lohrenz, T., Montague, P. R., & Sharot, T. (2020). Confirmation bias in the utilization of others’ opinion strength. Nature Neuroscience, 23(1), 130–137. https://doi.org/10.1038/s41593-019-0549-2
Min, Y., Jiang, T., Jin, C., Li, Q., & Jin, X. (2019). Endogenetic structure of filter bubble in social networks. Royal Society Open Science, 6(11), 190868. https://doi.org/10.1098/rsos.190868
Techopedia. (n.d.). What is a Filter Bubble? – Definition from Techopedia. Techopedia.Com. Retrieved February 22, 2021, from http://www.techopedia.com/definition/28556/filter-bubble
Vicario, M. D., Vivaldo, G., Bessi, A., Zollo, F., Scala, A., Caldarelli, G., & Quattrociocchi, W. (2016a). Echo Chambers: Emotional Contagion and Group Polarization on Facebook. Scientific Reports, 6, 37825. https://doi.org/10.1038/srep37825
Vicario, M. D., Bessi, A., Zollo, F., Petroni, F., Scala, A., Caldarelli, G., Stanley, H. E., & Quattrociocchi, W. (2016b). The spreading of misinformation online. Proceedings of the National Academy of Sciences, 113(3), 554–559. https://doi.org/10.1073/pnas.1517441113
Vicario, M. D., Zollo, F., Caldarelli, G., Scala, A., & Quattrociocchi, W. (2017). Mapping social dynamics on Facebook: The Brexit debate. Social Networks, 50, 6–16. https://doi.org/10.1016/j.socnet.2017.02.002