- Bakshy, E., Messing, S., & Adamic, L. A. (2015). Exposure to ideologically diverse news and opinion on Facebook. Science, 348(6239), 1130-1132.
- Eslami, M., Rickman, A., Vaccaro, K., Aleyasen, A., Vuong, A., Karahalios, K., … & Sandvig, C. (2015, April). "I always assumed that I wasn't really that close to [her]": Reasoning about invisible algorithms in news feeds. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems (pp. 153-162). ACM.
Reading reflections:
Most of us have wondered at some point whether a friend who no longer shows up in our Facebook news feed has blocked or restricted us. Often, we forget about them until they react to some post on our timeline, bringing them back to our attention.
People are becoming more aware that some mechanism populates their news feed with stories from their friends, the groups they have joined, and the pages they have liked. However, not all of them know whether the displayed content is just randomly selected, or whether Facebook uses a curation algorithm: a more sophisticated process that not only arranges and prioritizes what is displayed, but also filters out what it deems unnecessary or uninteresting for us.
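To make that distinction concrete, here is a minimal sketch of what such a curation step might look like. Everything in it is an assumption for illustration: the signals, weights, and threshold are made up, and Facebook's actual features and ranking model are not public.

```python
from dataclasses import dataclass

@dataclass
class Story:
    author: str
    # Hypothetical engagement signals; the real feature set is not public.
    affinity: float   # how often the viewer interacts with this author (0-1)
    recency: float    # newer stories score higher (0-1)
    likes: int

def curate(stories, threshold=0.5):
    """Toy curation: score each story, drop low scorers, rank the rest.

    Purely illustrative -- the weights and threshold are invented.
    """
    def score(s):
        return 0.6 * s.affinity + 0.3 * s.recency + 0.1 * min(s.likes / 100, 1.0)

    visible = [s for s in stories if score(s) >= threshold]
    return sorted(visible, key=score, reverse=True)
```

The point of the sketch is simply that curation is not random selection: some stories never make it past the threshold, and the user is not told which ones.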
- There needs to be some randomization in what is displayed to us, to break the echo chambers and filter bubbles created around us. This applies both to the news we want to read and to the stories displayed in the news feed. It is like going to Target to get a water bottle and finding an oddly placed but awesome pair of headphones in the aisle: one might not end up buying them, but they will certainly catch the attention, and might even lead you to the electronics section to explore.
- As for political news, not all people choose to read only what aligns with their ideology. Some prefer reading the opposite party's agenda, if only to pick points to use against an opponent in an argument, or simply to stay informed. Personalizing the news displayed to them based on what they "like" may not be what they are looking for, whatever their intention for reading that news may be.
- Eslami et al. discuss the difference in how users accepted this new knowledge, with some demanding to know the back story, while more than half (n=21) ultimately appreciated the algorithm. Some users felt betrayed by the invisible curation algorithm, while for others, learning that an algorithm controls what is displayed in their news feed was overwhelming. This seems especially plausible for elderly people who have not been social media users for long, or for less educated users. The authors also point to future work on determining the optimal amount of information to display to users "to satisfy the needs of trustworthy interaction" and "protection of propriety interest". An editable log that records changes to news feed content, whether caused by hiding a story or by a lack of interaction with a friend's, page's, or group's stories, and that is accessible only if the user chooses to see it, seems like a reasonable solution to this issue; a minimal sketch of such a log follows this list.
- I liked the clear and engaging narrative, from participant selection through data analysis, in the second paper, especially after reading the follow-up paper [1]. I do think there should have been more information about how participants reacted to stories missing from the groups they follow or the pages they have liked, or about the extent to which they preferred keeping the feed as displayed to them. That would have given useful insight into their thought process (or "folk theories") about what goes on inside the news feed curation algorithm.
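As promised above, here is a minimal sketch of the editable curation log proposed earlier. The event types, fields, and revert behavior are all assumptions for illustration; nothing like this is actually exposed by Facebook.

```python
from dataclasses import dataclass, field

@dataclass
class CurationEvent:
    story_id: str
    reason: str       # e.g. "user hid the story", "low interaction with this page"
    hidden: bool = True

@dataclass
class CurationLog:
    """Hypothetical user-facing log of feed-curation decisions.

    It stays out of the way unless the user asks to see it, and it lets
    the user revert individual decisions -- which is what makes it
    "editable" rather than a read-only audit trail.
    """
    events: list[CurationEvent] = field(default_factory=list)

    def record(self, story_id: str, reason: str) -> None:
        self.events.append(CurationEvent(story_id, reason))

    def view(self) -> list[CurationEvent]:
        # Shown only when the user explicitly opens the log.
        return list(self.events)

    def revert(self, story_id: str) -> None:
        # Un-hide a story that the algorithm (or the user) filtered out.
        for event in self.events:
            if event.story_id == story_id:
                event.hidden = False
```

A feed renderer could then exclude stories whose events have `hidden=True`, while the log itself supplies the back story that some of Eslami et al.'s participants demanded, without forcing it on those who would rather not see it.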
[1] Eslami, M., Karahalios, K., Sandvig, C., Vaccaro, K., Rickman, A., Hamilton, K., & Kirlik, A. (2016, May). First I "like" it, then I hide it: Folk theories of social feeds. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (pp. 2371-2382). ACM.