As a social media user, I want to warn the people who follow me to be careful. Social media is no longer the place you go to see your friends, the way it used to be; instead, I feel as though I am being fed information. I now see more outrageous accounts that spew hate and try to bend my beliefs toward whatever the platform shows me. My Facebook feed is filled with news from accounts I do not remember following, accounts with very strong opinions that are hateful toward minorities and vulnerable groups in our society. I have been trying to understand why this is happening, and I have been researching how our behaviours are collected. My aim is to educate the public to use the internet carefully, because it is a shadow of its old self.
Anyone on Twitter can agree that we each have at least ten bot followers. Most people do not get many retweets; instead, they get that random one that gets them excited, until they realize it is a fake account again. One article discusses how these bots mimic our behaviour and can be used to build traction for news designed to persuade us for political gain (UC, 2022). Many of these bots are run by people who are, in effect, employees of for-profit corporations paid to influence behaviour (UC, 2022). For example, during the last United States elections, the Russian Internet Research Agency hired people who understood American culture and spoke good English to push news, post comments and like posts designed to rile up the population (UC, 2022). These accounts aim to polarize the American public to the point that people would undermine their own government, putting the country on the path of a falling empire (UC, 2022). This is the underlying goal of bots and the cause of their growing numbers; the problem is so bad that an estimated 8.5% of Twitter accounts are bots (UC, 2022). That explains the one loyal follower who retweets you every time you tweet.
Another interesting piece, by Bansal, describes the patchwork of policy working to fend off misinformation (Bansal, 2019). He discusses how the US election created doubt in American democracy, with many experts confirming that foreign influences played a part in it; multiple bot-infiltrated Facebook groups were strongly influenced (Bansal, 2019). Facebook rolled out sweeping countermeasures, which surprisingly worked and provided some relief. On Twitter, however, there is still animosity, with hate groups and negative or fake news becoming more aggressive in their micro-targeting. The newest social media sensation is a prime example: many young men, and by now almost everyone on the planet, know of Andrew Tate. He was not someone I saw on my feed a few weeks ago, but after googling him once I cannot stop seeing him. It is just as bad on Facebook and TikTok.
Governments have started passing legislation to punish people who push fake news to sway the public; Canada, Singapore, France, Brazil and Egypt impose some of the most severe penalties. Another example is a study published in SCM, conducted on Facebook, showing that most people use comments to gauge whether news is false or real (Kluck, 1970). The growing number of bots can therefore either undermine factual information or, in the worst case, uplift fake news. Beshai (2018) uses node diagrams called cascades to illustrate how much deeper and farther false news travels on Twitter compared with real news; the imagery is stark. The visualization maps the data so that "breadth (how many times a given tweet is retweeted) corresponds to the width of the tree, and depth (how many 'generations' of retweeting occur) corresponds to the height" (Beshai, 2018). These 3D cascade images offer a straightforward reading of the state of our situation and are a wake-up call for the public to be wary and vigilant in consuming information.
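To make the breadth/depth idea concrete, here is a minimal sketch (my own illustration with made-up tweet IDs, not Beshai's actual code or data) that computes the depth (generations of retweeting) and maximum breadth (widest generation) of a cascade represented as parent-to-retweet edges:

```python
from collections import defaultdict

def cascade_shape(edges, root):
    """Return (depth, max_breadth) of a retweet cascade.

    edges: list of (parent_tweet, retweet) pairs
    root:  the original tweet that started the cascade
    """
    children = defaultdict(list)
    for parent, child in edges:
        children[parent].append(child)

    depth, max_breadth = 0, 1
    level = [root]
    while level:
        # Collect all retweets made of tweets in the current generation.
        next_level = [c for node in level for c in children[node]]
        if next_level:
            depth += 1                                # one more generation
            max_breadth = max(max_breadth, len(next_level))
        level = next_level
    return depth, max_breadth

# Hypothetical cascade: t0 is retweeted by t1 and t2; t2 by t3, t4, t5.
edges = [("t0", "t1"), ("t0", "t2"), ("t2", "t3"), ("t2", "t4"), ("t2", "t5")]
print(cascade_shape(edges, "t0"))  # (2, 3): two generations, widest has 3 retweets
```

On measures like these, Beshai's visualizations show false-news cascades growing both wider and deeper than true-news ones.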
I am not here to spread doom and gloom but to show the public that this is real and could affect our society. Hence, I will offer some insight into how Facebook, Twitter and other social media platforms can reduce how far false news penetrates our systems. The study "Real Solutions for Fake News?" found that tagging stories as disputed reduced belief in them by about 10 percentage points from a baseline of 29% who initially believed, while tagging them as rated false reduced belief from 29% to 16%, a 13-point reduction (Clayton, 2019). I believe some corporate social responsibility is warranted, and companies should invest more in fact-checking to protect society.
Finally, I believe the guidelines in the article "Fake News" can help reduce society's vulnerability. Questions I recommend asking oneself are: "Does the article come from an established, credible and rigorously fact-oriented news organization such as ABC, The Guardian or The Wall Street Journal? If not, encourage students to consider the general character of the publication: how are its stories presented? Who owns the organization, and are they interested in promoting a particular view?" (Henry, 2020). Other checks include examining the URL, watching for satire, using fact-checking websites, running a simple Google search and being aware of various types of bias (Henry, 2020). Confirmation bias is one I caught myself falling into with some news outlets: it can be as simple as holding a stereotype about a place and then treating a story as evidence for that stereotype. A final article shows that fact-checking, whether imposed or voluntary, reduces the spread of fake news by 25% (Chadwick, 2021). The same study finds that educating viewers about fact-checking reduces their sharing of false news by 67% per viewer and increases their sharing of fact-checked news by 58% (Chadwick, 2021). Hence, I believe vigilant self-awareness can be a massive tool for navigating the internet in these times of upcoming mid-term elections.
Beshai, P. (2018, March 9). Cover stories: Visualizing the spread of true and false news on social … Retrieved November 9, 2022, from https://www.science.org/doi/10.1126/science.aat4382
Bansal, S. (2019, October 4). The patchwork of policy working to fend off misinformation. Centre for International Governance Innovation. Retrieved November 8, 2022, from https://www.cigionline.org/articles/patchwork-policy-working-fend-misinformation/
Clayton, K., Blair, S., Busam, J. A., Forstner, S., Glance, J., Green, G., Kawata, A., Kovvuri, A., Martin, J., Morgan, E., Sandhu, M., Sang, R., Scholz-Bright, R., Welch, A. T., Wolff, A. G., Zhou, A., & Nyhan, B. (2019, February 11). Real solutions for fake news? Measuring the effectiveness of general warnings and fact-check tags in reducing belief in false stories on social media. Political Behavior. SpringerLink. Retrieved November 8, 2022, from https://link.springer.com/article/10.1007/s11109-019-09533-0
Chadwick, A., Vaccari, C., & Kaiser, J. (2021, March 17). The amplification of exaggerated and false news on social media: The roles of platform use, motivations, affect, and ideology. figshare. Retrieved November 8, 2022, from https://repository.lboro.ac.uk/articles/journal_contribution/The_amplification_of_exaggerated_and_false_news_on_social_media_the_roles_of_platform_use_motivations_affect_and_ideology/14223083
Henry, E., Zhuravskaya, E., & Guriev, S. (2020, June 4). Checking and sharing alt-facts. SSRN. Retrieved November 8, 2022, from https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3597191
Kluck, J. P., Schaewitz, L., & Krämer, N. (1970, January 1). Doubters are more convincing than advocates. The impact of user comments and ratings on credibility perceptions of false news stories on social media. SCM. Semantic Scholar. Retrieved November 8, 2022, from https://www.semanticscholar.org/paper/Doubters-are-more-convincing-than-advocates.-The-of-Kluck-Schaewitz/f32cfce6e8ec2331481a300085ff39e99ecaac6b
UC, S. B. (2022, November 8). How is fake news spread? Bots, people like you, trolls, and … Center for Information Technology and Society, UC Santa Barbara. Retrieved November 8, 2022, from https://www.cits.ucsb.edu/fake-news/spread