Trapped in the Facebook Bubble
Over the past few months, internet users have engaged in a heightened conversation about the inundation of fake news sites and the harm they cause to public understanding. Whether it is a falsified prediction of U.S. voter turnout or sensationalized coverage of an entertainment event, these fake news sources skew the way people view the world and misinform populations that have been taught to accept information without question. In light of the recent United States election, the prominence of fake news has come to the forefront. When these fake sources circulate online, particularly on social media platforms like Facebook, we run the risk of falling into an echo chamber of false information and filtered truths.
As online contributors, it is crucial that we be aware of the sources we use and the content we reference, especially in times of social and political turmoil. Since the new and widely criticized president of the United States was elected, talk of the digital mistakes we have made, and continue to make, has risen. Before this tumultuous election, terms like “echo chamber,” “filter bubble,” and “fake news” were known mainly within an intellectual crowd. Since President Trump took office, however, these terms have spread across the internet.
In 2011, Eli Pariser gave a TED talk titled “Beware Online ‘Filter Bubbles’” to share his concerns about the internet. In it, Pariser touched on the lack of control individuals have over the personalization of their filtered feeds of information, and on how detrimental this is both to the knowledge and awareness of the individual and to democracy itself. In describing the way platforms like Facebook and Google personalize content to show us what they think we would like to see rather than what we need to see, he used the analogy of a balanced diet. Pariser stated that in an ideal world we would have an equal balance between pleasure-based content and the political or crucial content necessary for the health of democracy, or in his words, “some information vegetables […] some information dessert” (5:30-5:35). When online algorithms base the information we see on the information we like, or click on, we are essentially surrounded by “information junk food” (5:50-6:00). This metaphor not only emphasizes the toxicity of filter bubbles but perfectly captures the appearance of fake and sensationalized news.
Disguised as a healthy meal and coated in a self-satisfying sensation, this information junk food is displayed all over the internet, and its clickbait design makes it nearly impossible to avoid indulging. Our use of computers and the internet for both work and play subconsciously blurs the line between professional and recreational communication, making fake news harder to spot (Frank). Russell Frank explains in “Caveat Lector: Fake News as Folklore” that he himself has fallen victim to fake news: after reading multiple articles that appeared reliable and were written in a journalistic style, he discovered that in fact all of the sources were falsified (Frank). These nearly indistinguishable fake news stories slip into voters’ feeds and provide them with exaggerated and misstated information. Buzzfeed reporter Craig Silverman explained that fake election news was shared, commented on, and reacted to 20% more than real election news in the months leading up to the 2016 U.S. presidential election (Berghel).
Once these fake news stories enter voters’ and citizens’ internet spheres, the likelihood of their circulating is very high. John Bohannon, author of “Is Facebook keeping you in a political bubble?”, states, however, that although the bubble exists, it does not hold the same weight as some may believe. In a study of over 10 million Americans of varying political beliefs, researchers found that Facebook’s algorithm made it only 1% less likely for stories to cross over between conservative and liberal Facebook profiles (Bohannon). After the study concluded, Bohannon explained that regardless of the likelihood of crossover between political viewpoints, these bubbles are still no matter to be taken lightly.
In his 2012 article “‘Social Voting’ Really Does Rock the Vote,” he explained the reality of a Facebook herding bias. In 2012, in an attempt to increase voter turnout, Facebook rolled out a prompt on election day allowing users to click an “I voted” button while displaying photos of six friends who had already voted. Facebook users whose friends had already clicked the “I voted” button were 0.39% more likely to vote themselves (Bohannon, “‘Social Voting’”). These Facebook statistics were then compared to state voter records, and the percentage held. Although this case study does not bear directly on the topic of fake news, it exemplifies the notion that events that take place online have real-world translations and effects.
As an online contributor, I feel it is my responsibility to be careful with the content I post and share. Like many others, I have fallen victim to believing fake news, and have even shared it. As someone who has grown up with the internet and social media, my immediate instinct is to trust the platforms on which I operate. In light of recent events and scandals, and with greater education, I have made an active decision to question every source I come into contact with and to think of those I may influence. It is easy to think that my sharing a fake news story on Facebook will have little to no impact on the world around me. What is challenging is accepting that what happens online extends into real life, and that the filter bubble we see on Facebook has more control over us than we may believe.
Berghel, Hal. “Lies, Damn Lies, and Fake News.” Computer 50.2 (2017): 80-85. IEEE Xplore. Web. 25 Feb. 2017.
Pariser, Eli. “Beware Online ‘Filter Bubbles.’” TED, Mar. 2011. Web. 24 Feb. 2017.
Bohannon, John. “Is Facebook Keeping You in a Political Bubble?” Science (2015): n. pag. Web. 25 Feb. 2017.
Bohannon, John. “‘Social Voting’ Really Does Rock the Vote.” Wired. Conde Nast, 13 Sept. 2012. Web. 25 Feb. 2017.
Frank, Russell. “Caveat Lector: Fake News as Folklore.” Journal of American Folklore 128.509 (2015): 315-332. Project MUSE. Web. muse.jhu.edu/article/589183.