Monthly Archives: October 2016

Essay #1

The Cycle of Confirmation

Zoe Vedova



You want your opinions to be right, but why stop there? What if there were a place where you could always be right, where your beliefs were continuously validated and there was an endless supply of proof supporting your claim? Welcome to Facebook. Facebook is an omnipotent social media platform that collects data about its users’ online habits, then places misleading or false news articles onto people’s newsfeeds; with few users able to distinguish fact from sponsored fiction, this confirmation bias (Norman, Lecture week 6) has far-reaching consequences for people’s lives, and for democracy as a whole. This paper will examine how confirmation bias works through Facebook’s algorithms, and the risk it poses to individual people as well as to society at large. Forty-five percent of Americans get their news from Facebook (Shearer & Gottfried, 2017), giving the website unprecedented power to curate what news people see, shaping worldviews by trapping users in heavily filtered bubbles.


Scrolling through Facebook is not as innocuous as it seems; every image you hover over and every article you share is being recorded. Even your location is prime data for Facebook to sell to ad companies so those companies can sell back to you. This process is the first step in setting up confirmation bias. It works in myriad ways, detailed by Barbara Ortutay in the Inc. online magazine article “Tips on how to avoid Facebook’s new ad tracking” (2014). Ortutay described the process, starting with how advertisers pick from a series of “attributes such as age, gender … and language” to best target the user. From there, Facebook works in affiliation with “outside analytical firms” to track which websites and apps a user visits, in a process known as “interest-based targeting.” For instance, if a user were to click on a Facebook news story that linked out to Breitbart, a far-right American news site which endlessly propped up Trump with misleading articles, Facebook would collect that information and use it to show ads related to the user’s ‘interests.’ Perhaps a product from the Breitbart website would appear on their news feed, something such as a “Safe spaces are for snowflakes” bumper sticker, or, even more sinister, a false news story fed to them simply because it matches the user’s previous interests. The user is now trapped in a cyber world of their own creation, constantly reaffirmed by Facebook’s algorithms.
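The feedback loop described above can be sketched in a few lines of Python. This is a purely hypothetical simulation, not Facebook’s actual system; the function names, story data, and ranking rule are invented for illustration. The point is structural: each click strengthens an interest profile, the profile ranks the next feed, and the ranking steers the next clicks, so the feed narrows on its own.

```python
# Hypothetical sketch of an interest-based targeting feedback loop.
# Nothing here is Facebook's real code; it only illustrates the cycle
# the essay describes: clicks -> interest profile -> ranked feed -> clicks.

from collections import Counter

def rank_feed(stories, interests):
    """Put stories from the user's most-clicked topics first."""
    return sorted(stories, key=lambda s: interests[s["topic"]], reverse=True)

def simulate_feed(stories, clicks_per_round, rounds):
    interests = Counter()
    for _ in range(rounds):
        feed = rank_feed(stories, interests)
        # The user clicks the top items, and each click reinforces
        # the very interests that ranked those items at the top.
        for story in feed[:clicks_per_round]:
            interests[story["topic"]] += 1
    return interests

stories = [
    {"title": "Far-right op-ed", "topic": "politics"},
    {"title": "Vegan recipes", "topic": "food"},
    {"title": "Local sports recap", "topic": "sports"},
]
print(simulate_feed(stories, clicks_per_round=1, rounds=5))
```

After a single early click on a politics story, politics stories occupy the top slot every round thereafter, so the profile ends up containing nothing but politics: the filter bubble builds itself from one initial nudge.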


The cycle of confirmation bias doesn’t end with far-right neo-Nazi sympathizers imprisoned in a hate bubble. Of the 245.3 million adults living in America, the Pew Research Center found that 45% get their news from Facebook (Shearer & Gottfried, 2017); that leaves roughly 110.4 million Americans in danger of being led to false articles. Although sixty-four percent of Americans believe that false news leads to real-life confusion, thirty-nine percent claim they can recognize fake articles online and thus escape the negative effects (Barthel, Mitchell, & Holcomb, 2016). However, a 2016 study conducted by the Stanford Graduate School of Education found that eighty percent of participants were unable to separate sponsored content from real news stories (Donald, 2016). With nearly half of Americans using Facebook as their primary source of news, and many unable to differentiate between real and false, the cycle of confirmation bias only grows stronger, and more dangerous as it leaves people’s personal lives and affects public decisions, such as whom to vote for.


The 2016 American election steals a lot of limelight as the poster child for the hazards of Facebook’s fake news and confirmation bias. However, the “electoral battle ground” Facebook provides utilizes the same “media filter bubbles and algorithms” that affect every election, as well as our collective sense of democracy (Hern, 2017). If a Facebook user cannot distinguish between a real and a sponsored story about veganism, the misinformation that could spread within their personal life is fairly benign. More malignant is false news centred on governments and political candidates. The Journal of Economic Perspectives published an article from Stanford University on fake news affecting elections, detailing how fake news makes it increasingly difficult to “infer the true state of the world” (Allcott & Gentzkow, 2017, p. 2). In fact, the article goes on to suggest that “Donald Trump would not have been elected president were it not for the influence of fake news” (p. 2). Fake news, perpetuated by algorithms trapping users in a cycle of confirmation bias, poses a massive threat to life off the internet.


Facebook has a multitude of tactics that continuously confirm biases as a news source. Whether through algorithms that track a user’s interests or filters that only show high-paying sponsored content, there is real danger in people not being able to distinguish between real and false news, a danger that could even threaten our sense of democracy. Facebook must either accept responsibility for the effects of over-curating content, or users must learn how to spot fake news for themselves. Society needs to be able to trust our published news for a democracy to thrive.















Allcott, Hunt, & Gentzkow, Matthew. (2017). Social media and fake news in the 2016 election. Journal of Economic Perspectives, 31(2), 211-236.


Barthel, Michael, Mitchell, Amy, & Holcomb, Jesse. (2016, December 15). Many Americans believe fake news is sowing confusion. Pew Research Center.


Donald, Brooke. (2016, November 22). Stanford researchers find students have trouble judging the credibility of information online.

Hern, Alex. (2017, May 22). How social media filter bubbles and algorithms affect the election. The Guardian.


Ortutay, Barbara. (2014, June 20). Tips on how to avoid Facebook’s new ad tracking.


Shearer, Elisa, & Gottfried, Jeffrey. (2017, September 7). News use across social media platforms 2017. Pew Research Center.





#5 – Changes

This Tuesday marked the second peer review of the course.
I reviewed Jenny Chan’s blog, which you can find up on my website under PUB 101, and they returned the favour. This review focused on design, primarily the intuitiveness and consistency of the blog in terms of overall flow and font type.

The reviews are always helpful; I really like knowing what people like to see and what they think needs work. Jenny was in full support of my blog’s aesthetic and pictures, writing that they made her feel my blog was where she belonged, which made me happy to read.
She went on to point out that the word “Cuisine” was spelled wrong in my tagline (not at all surprising given my awful spelling track record), so I went back into the customize section of my blog and fixed it. At her suggestion, I also tried to change the font of the tagline, only to find that my theme does not allow it. At a later date I’ll try to write the code myself for a new font. Instead, I’ve given my blog a bit more identity with a website icon. The one I have saved now isn’t my favourite though, so I’m still on the lookout.

#4 – Public and Fysh

A constant daydreamer. A bad sense of direction. An affinity towards chocolate chip ice cream.

If there is any audience for this website, and honestly I would love for there to be, that would be our collective group identity. Of course, this being the internet, the author has almost no control over who interacts, reads, or posts on this webpage. I could luck out and gather my perfect following of people who appreciate Fysh and her sardonic posts satirically commenting on sanctimony in the realm of the Armageddon, or I could accidentally attract the neo-Nazi underbelly of the internet, who feed off anything that could have been written by Freud if he tweeted like Trump.
Thankfully, that’s not likely to happen.
I’ve attempted to format my website so the process posts are farthest from the reader’s view. I want people’s full attention on the apocalypse posts. I’ve also tried to keep the categories and pages as easy to navigate as possible, partially so I don’t get confused and misplace something, but also so the website is as accessible as possible to everyone. As long as people can find what they need intuitively, I’m happy not being too fancy.
(This is most likely my audience)


#3 – Tumblr trash

Welcome to Tumblr. 
A micro-blogging platform promising a hipper version of Reddit and a geekier version of Instagram, luring in every subculture in between and ultimately culminating in a complete mess of internet ideologies and etiquette, or lack thereof.
And I would know; I still manage to waste time every week scrolling through its endless feed, pacifying myself with memes. This website’s overdramatic ridiculousness is not lost on its users, who maintain the endless cycle of producing and consuming each other’s work.

Victor Kaptelinin, writing in the Encyclopedia of Human-Computer Interaction, held up intuitiveness as a website’s most crucial aspect. Your website may be a visitor’s first impression of a company or organization, and Kaptelinin explains that how well that group “exploit(s) the power of perception” can secure or lose a potential client.
Tumblr’s overall usability is fairly high. (Setting up a basic blog was something I managed to pull off in grade eight, so it must have been straightforward.) Instagram took from Tumblr the general idea that you have a blog, follow other blogs, and view their content on your dashboard, as shown below. The tagging system is also extremely thorough, as you can search all posts through tags, much like Twitter.

From this image of my dashboard, it’s easy to spot the search bar at the top right, the message and profile buttons at the top left, as well as all the types of posts you can make, laid out top centre.

The only place this gets complicated is on mobile. While the website is calibrated to the app, the app often doesn’t post your content properly at the right time, takes a long time to load, and is generally less intuitive than the desktop version. This is unfortunate, as most users need the mobile version more.

When Mauve Pagé presented in lecture in week 4, she talked about how font impacts the perception of a website as well. The small, rounded font of the Tumblr logo is inviting because it is nonthreatening. The theme of easy, casual enjoyment is kept consistent in how the blog posts’ corners are rounded too.

Tumblr’s other failings come in the form of blocking, messaging, and newly installed advertisements.
Blocking a person’s posts or messages is an incredibly important feature, especially in our days of cyberbullying; however, Tumblr makes it incredibly hard to blacklist anyone.
The messaging system is oddly rudimentary for an advanced website, having only just been updated to instant messaging.
Travis Gertz’s reading offers an answer for why the ads on this website are so frequent in the dashboard: “Crap content selling crap.” Tumblr isn’t inherently a place to sell goods, so Yahoo had to make money somehow, apparently by filling dashboards with the strangest clickbait ads imaginable.
Using crap ads to sell us what appear to be crap scam services.


All in all, Tumblr’s a decent website, and I hope to use the design techniques I’ve learned from these readings to increase my own website’s usability.