In its early stages, social media was thought to be a revolutionary aid for democracy. The optimistic outlook was that social media would connect the world by creating enormous networks of connections and facilitating the spread of ideas and the discussion of opinions (Haidt & Rose-Stockwell, 2019). However, as we move away from the “unprecedented democratization of writing” of blogs (Derakhshan, 2015), we enter an era dominated by social media giants like Twitter and Facebook. As we advance into this new digital world, it becomes clear that the impacts of social media on democracy are far more complex and multifaceted than initially expected. The anticipated network of connections is often hindered by filter bubbles, which, combined with the spread of disinformation, make for highly polarized political extremes. When these worlds collide, the result is often aggressive – far from the harmonious discussion that had once been hoped for.
These “filter bubbles”, a term coined by Eli Pariser in a 2011 TED Talk, are the product of algorithms that show readers information reinforcing their current perspective without introducing opposing points of view (Derakhshan, 2015). With algorithms working to push users toward further extremes, it is easy to become entrenched in one’s viewpoint. As Mod observes in his article “How I Got My Attention Back”, humans are easily manipulated by these algorithms and, generally speaking, tend to fall into step with whatever we are consistently exposed to on social media (Mod, 2017).
The unprecedentedly rapid spread of disinformation is particularly pertinent now, during the COVID-19 pandemic, but it is also consistently relevant to politically charged debates. Often, these social media disinformation campaigns serve to support authoritarian and far-right parties. This is achieved by stirring up panic about minority groups, fostering mistrust of mainstream media, and generating falsehoods about political opponents (Beauchamp, 2019). Disinformation, more derisively referred to as “fake news”, spreads by exploiting the algorithms that promote highly “engaging” content. Content that is outrageous and “pushes boundaries”, so to speak, draws the most interaction from social media users, and is therefore propagated across the web and shown to still more users (Haidt & Rose-Stockwell, 2019).
To fan the flames further, those using social media, engaging in vitriolic political debates, and disseminating false information are wholly disinhibited by invisibility, asynchronicity, and the dissociation of online behaviour from “real life” (Suler, 2004). This makes for political discussions that are more aggressive and polarized than those we might see face to face. Even so, separating social media and politics is neither feasible nor advisable.
Many (if not all) politicians have come to rely on social media in some capacity to relay information about their campaigns to the general populace. Likewise, many members of the public rely on some stream of social media to remain politically informed. Removing politics or politicians from social media, while it would certainly remove a great deal of aggression, would therefore leave millions uninformed about the politics of their nation. Moreover, the public often uses social media platforms to discuss the actions of their leaders among themselves and, when unsatisfied, to confront those leaders with demands for change. Removing this power from the people would be unacceptable, and an assault on freedom of speech.
A recent example of the public holding a figure of authority accountable is the swift and harsh response to Ted Cruz’s attempt to flee to Cancun during the weather emergency in his home state of Texas. When the Democratic and leftist communities on Twitter found out that he had abandoned his constituents (and travelled internationally during a global pandemic), the communal outrage was such that he returned within hours. This “cyberbullying”, as many called it, of a US senator is a clear example of the minimization of status and authority online (Suler, 2004). Without the connectedness of the digital social media community, it is difficult to imagine such pressure being exerted on someone with Cruz’s power and influence; indeed, millions of people would not even have been aware of a politician’s actions within hours of them occurring.
This newly essential line of communication between the people and their figures of authority is perhaps part of the reason why Donald Trump was allowed to remain active on his favourite soapbox for so long. While his actions certainly violated Twitter’s terms of service, removing the sitting president of the United States from social media is an action that carries far more weight than removing a simple “troll” with 300 followers.
In conclusion, we are a far cry from the optimistic future we had envisioned when social media was in its infancy. Filter bubbles, disinformation, and toxic disinhibition plague our pages and creep through our content. However, we must bravely forge onwards into the unknown, because after all, what other choice is there?
Beauchamp, Z. (January 2019). Social Media is Rotting Democracy from Within. Retrieved from https://www.vox.com/policy-and-politics/2019/1/22/18177076/social-media-facebook-far-right-authoritarian-populism
Derakhshan, H. (July 2015). The Web We Have to Save. Retrieved from https://medium.com/matter/the-web-we-have-to-save-2eb1fe15a426
Haidt, J., & Rose-Stockwell, T. (December 2019). The Dark Psychology of Social Networks. Retrieved from https://www.theatlantic.com/magazine/archive/2019/12/social-media-democracy/600763/
Mod, C. (January 2017). How I Got My Attention Back. Retrieved from https://www.wired.com/2017/01/how-i-got-my-attention-back/
Pariser, E. (March 2011). Eli Pariser: Beware online “filter bubbles” [Video file]. Retrieved from https://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles/up-next?language=en
Suler, J. (2004). The Online Disinhibition Effect. CyberPsychology & Behavior, 7(3), 321–326. DOI: 10.1089/1094931041291295