Papers –
[1] “I always assumed that I wasn’t really that close to [her]”: Reasoning about invisible algorithms in the news feed – Eslami et al.
[2] Exposure to ideologically diverse news and opinion on Facebook – Bakshy et al.
Summary –
[1] is a qualitative paper discussing end users’ awareness of and attitudes towards “invisible” algorithms, and the impact those algorithms have. Focusing on Facebook’s News Feed curation algorithm, it tries to answer three research questions: whether users are aware of the algorithm, what their opinion of it is, and how their behavior and attitudes change long-term after they are shown the algorithm’s impact. It finds that 62.5% of participants were not initially aware that the content on their feeds was curated, that initial reactions to learning how much their feeds were being altered ranged from surprise to anger, and that 83% of participants reported changed behavior in the following months, although their satisfaction with the algorithm remained the same as it was before.
[2] is a short quantitative paper that examines interactions between politically heterogeneous groups on Facebook, and the extent of “cross-cutting” content (i.e., exposure to posts belonging to the opposite ideological camp) on either side.
Reflection –
Firstly, it must be noted, as the authors themselves acknowledge, that the interview sample in [1] was very small and quite biased. The results could be made stronger by replicating the study with a larger and more diverse sample.
An interesting statement made in [1] is that users construct a “mental model” of the software, as if it worked by some consistent, universal internal logic which they inherently learn to interpret and abide by, e.g. the inference that if a group’s feed is curated, then there is no reason a user’s feed should not also be curated. Of course, such consistency does not arise automatically; it is up to the developer to enforce it manually. This highlights for me the importance of understanding which mental models users will form, and of not implementing functionality that might lead them to build inaccurate mental models, and thus make inaccurate inferences about how to use the software.
Another interesting observation in [1] likens the use of “hidden” algorithms, which guide user behavior without users noticing, to the design of urban spaces by architects. This, of course, was discussed in depth in Whyte’s film The Social Life of Small Urban Spaces, which was shown in class earlier this semester.
[1] states that most users, when questioned some months after taking the survey, were just as satisfied with their news feeds; yet it also reports that, when asked to sort friends into the categories “Rarely Shown”, “Sometimes Shown”, and “Mostly Shown” for their news feed, users on average moved 43% of their friends from one category to another. This indicates a sort of paradox, where users are satisfied with the status quo but would still drastically alter the results given the choice. It might imply a resigned acceptance of the algorithm’s whims, grounded in the knowledge that the curated feed is better than the unedited mess of all of their friends’ social media posts.
[1] ends by commenting on the tradeoff between usability and control: developers are incentivized to make software usable, at the cost of taking power out of users’ hands. This is observed outside social media platforms too. Software that offers extensive control and customizability tends to have a steep learning curve, and vice versa. This also raises the question of how much control users deserve, and who gets to decide that.
[2] focuses on the extent of interaction between users who hold different political beliefs. It finds a roughly 80/20 split between friends of the same ideology and friends of a different ideology. It claims that ideologically diverse discussion is curtailed by homophily, and that users, despite being exposed on average to ideologically diverse material, choose of their own accord to interact with posts they themselves align with.
[2] also finds that conservatives share more political articles than liberals. I wonder whether this is because of something inherent in the behavior or mentality of conservative individuals, or a trait of conservative culture.
[2] uses only political belief as the separator, treating sports, entertainment, etc. as neutral. However, sports are also subject to partisan behavior; a study along the same lines could use rival sports teams as the separator.