The amount and range of sources of information, news, and opinions we are exposed to every day have increased significantly thanks to online social media platforms. With these platforms acting as mediators for sharing information between their users, multiple questions arise about how their design and operation affect the information that reaches an end-user.
When designing an online platform with an overwhelming amount of information flowing through it every day, it may make sense to build in personalization techniques to optimize the user’s experience. It might also make sense, for example, for an advertising platform to optimize the display of ads in a way that affects what an end-user sees. Many design goals can result in the end-user receiving filtered information. However, the lack of transparency of these techniques to end-users, as well as the effect of filtering on the quality and diversity of the content that reaches a user, are significant concerns that need to be addressed.
In their work Exposure to ideologically diverse news and opinion on Facebook, Eytan Bakshy et al. study the factors affecting the diversity of the content that Facebook users are exposed to. They took a data-driven approach to analyze the proportion of content from a different ideology versus content from an aligning ideology that a user sees in their Facebook newsfeed. They found that the factors contributing most to limiting the diversity of a user’s newsfeed content are the structure of the user’s friend network and what the user chooses to interact with. The study also found that the newsfeed ranking algorithm affects the diversity of the content that reaches a user; however, this algorithm adapts to the user’s behavior and interactions. From this perspective, they concluded that “the power to expose oneself to perspectives from the other side in social media lies first and foremost with individuals.”
I agree to some extent with the findings and conclusions of the study discussed above. However, one major concern remains: to what extent are Facebook users aware of these newsfeed ranking algorithms? Eslami et al. try to answer this critical question in their work “I always assumed that I wasn’t really that close to [her]”: Reasoning about invisible algorithms in the news feed. They conducted a qualitative study of 40 Facebook users, examining their awareness of the newsfeed curation algorithms. The study showed that the majority were not aware of these algorithms and that their effects are widely misunderstood among users. Although most users came to appreciate the importance of these algorithms after learning that Facebook controls what they see, their initial reaction to discovering that these algorithms exist was highly negative. The study also revealed how people make wrong assumptions, for example when they do not see a post from a friend for a while.
Let me imagine myself as a mediator between the users and designers of a Facebook-like social platform, trying to close this gap. I fully agree that every user has the right to know how their newsfeed works, and every user should feel that they are in full control over what they see and that any hidden algorithm is only helping them personalize their newsfeed. On the other hand, it is a hard design problem for platform designers to reveal all their techniques to end-users, simply because the more complex the platform becomes to use, the more likely users are to abandon it for simpler platforms.
If I were hired to alter the design of a social platform to make users more aware of any hidden techniques, I would start with a very simple message, conveyed through an animated video, that raises users’ awareness of how their newsfeed works. This could be as simple as saying, “We work to ensure you the best experience by personalizing your newsfeed, and we would appreciate your feedback.” To gather that feedback, users could see occasional messages that ask simple questions or offer tips, such as “You’ve been interacting with x recently; to see more posts from x, you can go to settings and adjust this and that.” Over time, users would become more aware of how to control what they see in their newsfeed. Continuously collecting feedback from users on how satisfied they are with the platform would also help improve the design over time.
I understand that addressing such a problem is more complex and challenging than this, and that there may be hundreds of other reasons why hidden algorithms control what an end-user receives. However, ensuring a higher level of transparency is crucial to the health of online social platforms and to users’ satisfaction with them.