Reflection #5 – [02-06] – [Patrick Sullivan]

“Exposure to Ideologically Diverse News and Opinion on Facebook” by Bakshy, Messing, and Adamic explores how social media sites like Facebook deliver political information to users and how that exposure affects their political perspectives.

This research shows the influence that social media can have in polarizing opinions on certain topics. Since social media algorithms are tuned to recommend content that the user ‘likes’, most of the content a user sees will match their political leaning, polarizing the political landscape even further.

Perhaps this could be remedied by renaming the ‘like’ action to a different word with less problematic connotations. A user study could both replicate this research and test alternatives and their effects. A breakthrough here would allow social media sites to present a more transparent picture of what the action actually means. For instance, if users ‘nod’ at articles on Facebook instead of ‘liking’ them, they might interpret the action as ‘I endorse this content as correct’ rather than the current meaning of ‘I like this content, want more of it, and want my friends to know I like it’.

This is also a good opportunity for social media sites to adjust their algorithms to account for political leanings, and to continually tune their recommendation engines to be as unbiased as possible. Doing so would have a significant impact on the political exposure and perspectives social media users experience, and could lead to more moderate views being expressed and more nuanced arguments being used as support.

In addition, I wonder whether there is a measurable difference in users’ accuracy when self-reporting as members of a minority political party in countries where dissenting political opinion is actively repressed. Could this be used to detect repression of opinion that is implicitly understood among the population, as opposed to explicit and visible political threats?

“Echo Chambers Online?: Politically Motivated Selective Exposure Among Internet News Users” by Garrett investigates whether users prefer support for their own political stance over counterarguments to an opposing one.

I believe this result might be caused by users having a stronger background in the information their political party promotes, and therefore a better understanding of the supportive articles they appreciate. Participants in the study may find it difficult to parse an article countering an opponent’s view, since they are likely less familiar with that viewpoint.

Could a user’s habit of only selecting articles that support their viewpoint be considered self-censorship? Forcing users out of this behavior would likely qualify as a violation of freedom of speech and freedom of the press. Perhaps it is more viable to incentivize users to read articles that present a conversation between opposing political perspectives, or coverage from a less biased news source.
