Readings:
[1] “I always assumed that I wasn’t really that close to [her]”: Reasoning about invisible algorithms in the news feed
Summary:
This paper examines how people reason about the ubiquitous curation algorithms that shape our digital lives. The authors discuss varying levels of algorithm awareness among users, participants' conceptions of the Facebook News Feed before and after the study, and participants' reactions to learning that a hidden curation algorithm exists, along with how that discovery changed their perceptions. To show the difference between raw and curated feeds, the authors built a tool called FeedVis that displays a user's unfiltered feed of stories from friends and pages. By asking open- and closed-ended questions, the authors were able to gauge how well users understood the curation algorithm. The paper tackles three distinct research questions and delivers adequate answers for each, along with directions for future work.
Reflection/Questions:
It was interesting to read that several users had started actively trying to manipulate the algorithm, especially because I am aware of it myself and it does not bother me at all. Early in the paper, the authors suggest that disclosing how the curation algorithm works could build trust between users and the platform. I would argue, however, that if the algorithm's workings were made public, trolls, fake-news outlets, and other malicious actors could exploit that information to increase the reach of their posts and propaganda. The authors also describe their participants as "typical Facebook users," a characterization I would dispute: the meaning of a "typical" Facebook user is fluid. A few years ago it meant millennials; now it skews toward baby boomers and Generation X. In my view, Facebook could show users unfiltered results on some days and curated results on others, then track their activity (likes, comments, shares) and use that data to decide whether each user prefers curated or unfiltered results. Facebook could also let users tell the algorithm which friends and pages they are most interested in, which might help the algorithm learn more about them.
[2] Exposure to ideologically diverse news and opinion on Facebook
Summary:
The authors of this paper study how Facebook users interact with news on social media, the ideological diversity of news shared on Facebook overall, and the diversity of news shared within friend networks. They also examine what the curation algorithm chooses to display to a user and how selective consumption of news affects that user. The authors explain that selective consumption results from two factors: people tend to have more friends who share their ideology, so they see reinforcing news, and the curation algorithm tends to display what it predicts the user will like most, which is again news that reinforces their ideology (I would argue this is why fake news will never die).
Reflection/Questions:
In my view, people with a particular ideological standpoint will rarely be able to fathom the other side and, for the most part, will not even try to read or watch news from a different ideological point of view. Historically we can see this in cable television: conservatives tend to watch Fox more often, while moderates and liberals tend to watch CNN. Each of these channels understood its audience and delivered content tailored to it. Now, instead of companies determining news content, a curation algorithm does it for us. I do not think this is something that needs to be fixed or a problem that needs to be tackled (unless, of course, it is fake news). It is basic human psychology to find comfort in the familiar, and if users are forced to digest news content they are unfamiliar with, it will, on a very basic level, make them uncomfortable. I also think developers would be crossing a line if they manipulated a user's news feed in ways inconsistent with that user's Facebook usage, friend circle, and followed pages.