Reflection #5 – [09/10] – [Lindah Kotut]

  • Motahhare Eslami et al. “‘I always assumed that I wasn’t really that close to [her]’: Reasoning about invisible algorithms in the news feed.”
  • Eytan Bakshy, Solomon Messing, and Lada A. Adamic. “Exposure to ideologically diverse news and opinion on Facebook.”

Reflection: 
1. Ethics
The Facebook study is apt, given the recent New Yorker spotlight on Mark Zuckerberg. The piece, while focusing on Zuckerberg rather than the company, gives good insight into the company ethos, which also gives context to Bakshy’s claims. Considering the question of the invisible algorithm: Eslami’s paper addresses it directly, outlining the benefits of making the consequences of the algorithm’s changes visible, if not the algorithm itself. Given the anecdotes of users who changed their minds about which friends they’d like to hear more from, this is a good decision, allowing for a sense of control and trust in the curation process.

Eslami’s paper then raises a concern about the effect these unknowns have on decision making. Consider the (in)famous Nature paper on large-scale experiments with social versus informational messaging to influence election turnout, and the other infamous paper on experimenting with information contagion: both used millions of users’ data and raise the issue of ethics. Under GDPR, for instance, Facebook is obligated to let users know when and how their data is collected and used. What about when that information is manipulated? This question is explicitly considered in Eslami’s paper, where users were found to be angered (from the anecdotes, it struck me as betrayal more than anger) after finding out about design decisions that had a real-life impact, explicitly: “it may be that whenever a software developer in Menlo Park adjusts a parameter, someone elsewhere wrongly starts to believe themselves to be unloved.”

2. Irony
Bakshy’s considers their work as a neutral party in the debate about whether (over)exposure to politics is key to a healthy democracy, or whether they lead to a decreased level of participation in democratic processes. They then conclude with the power to expose oneself to differing viewpoint lies in the individual. Yet Facebook curates what a user sees in their newsfeed, and their own research showed that contentious issues promote engagement, and that engagement raises the prominence of the same content — raising the chances of a typical user viewing it. They attempt to temper this in defending the nature of the newsfeed to be dependent on the users logging/activity behavior, but this goes to show that they place the onus again on the user again … to behave in a certain manner for the algorithm to succeed and obtain consistent data?

3. Access, Scale and Subjectivity
I found it interesting how the two papers sourced their data. Eslami et al., though they had access to respondents’ data, still had to deal with the throttling imposed by the Facebook API. Bakshy et al., on the other hand, had anonymized data from millions of users. This disparity does not threaten the validity of either study; it is just a glaring point. It would be interesting if Eslami’s work could be scaled to a larger audience: the interview process is not very scalable, but elements such as users’ knowledge of the algorithm’s effects are especially important to test at a larger scale.

The issue of subjectivity manifested differently in the two works. Eslami et al. were able to probe users on the personal reasons behind their actions on Facebook, giving interesting insights into their decisions. Bakshy’s work treated the sharing of content as a marker of ideology. What of sharing for criticism, irony, or reference? (From what I understood, alignment was measured from the source and clicks on shared links, rather than also including the commentary attached to the share.) The reasons posts are shared range between the two extremes of support and criticism, and the motivation behind the sharing makes a consequential difference in what we can conclude from engagement. The authors note this limitation both in the source of their data (self-reported ideological affiliation) and in their vague distinction between exposure and consumption.
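The critique is easier to see with a toy example. The sketch below (Python, purely illustrative; the data, the -2 to +2 scale, and the function name are my assumptions, not the authors’ actual pipeline) scores a shared URL by the mean self-reported affiliation of its sharers, in the spirit of Bakshy et al.’s alignment measure. The commentary attached to each share is carried along but never used, so a critical share counts exactly like a supportive one.

```python
# Illustrative sketch only: a toy content-alignment score in the spirit of
# Bakshy et al., where a URL's ideological alignment is the mean self-reported
# affiliation of the users who shared it (-2 = very liberal ... +2 = very conservative).
from statistics import mean

# Hypothetical share records for one URL: (sharer affiliation, share commentary)
shares_of_url = [
    (+2.0, "Great piece, everyone should read this"),  # supportive share
    (+1.5, "Finally someone says it"),                  # supportive share
    (-2.0, "Can you believe this nonsense?"),           # critical share
]

def content_alignment(shares):
    """Mean self-reported affiliation of the sharers; the commentary never enters."""
    return mean(affiliation for affiliation, _commentary in shares)

print(content_alignment(shares_of_url))  # 0.5 -> reads as mildly conservative content
# The critical share is counted just like an endorsement, which is exactly why
# the motivation behind sharing matters for what engagement can tell us.
```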
