Reflection #5 – [02/06] – Aparna Gupta

  1. Garrett, R. Kelly. “Echo chambers online?: Politically motivated selective exposure among Internet news users.” Journal of Computer-Mediated Communication 14.2 (2009): 265-285.
  2. Bakshy, Eytan, Solomon Messing, and Lada A. Adamic. “Exposure to ideologically diverse news and opinion on Facebook.” Science 348.6239 (2015): 1130-1132.

Both papers discuss how exposure to news and civic information is increasingly mediated by online social networks and personalization. The emphasis is on how this is leading to an era of “echo chambers,” in which people read news and information that favors their own ideology and opinions.

Garrett demonstrated that opinion-reinforcing information promotes news story exposure, while opinion-challenging information makes exposure only marginally less likely. He conducted a controlled study in which participants were presented with news content and a questionnaire. However, I am not convinced by the choice to present participants with the kind of news they already held strong opinions about; this could have biased the conclusions drawn from the study. Although the paper presents some interesting findings about opinion-reinforcing and opinion-challenging content and how readers perceive information when presented with it, I was unable to correlate the claims with the findings the author specifies. Also, the study revolves around three issues – gay marriage, social security reform, and civil liberties – which were current topics in 2004. Does this mean that the results presented won’t generalize to other topics? Across all the papers we have read so far, generalizing results across genres and geographic locations looks like a major roadblock.

Bakshy et al. used deidentified data to examine how 10.1 million Facebook users interact with socially shared news. Their focus is on how ideologically heterogeneous friends could expose individuals to cross-cutting content. Apart from “echo chambers,” the authors also discuss “filter bubbles,” in which content is selected by algorithms according to a viewer’s previous behavior. I like the quantitative analysis the authors present to quantify the extent to which individuals encounter more or less diverse content while interacting via Facebook’s algorithmically ranked News Feed. Beyond this, in my opinion two further questions should also be considered: how likely is it that an individual will share a cross-cutting post with their friends, and what happens if an individual never clicks on the link to a cross-cutting post?

In the end, it makes me wonder how the results would differ if the authors of both papers had conducted their studies on individuals outside of the US.


Reflection #5 – [02/06] – [Ashish Baghudana]

Garrett, R. Kelly. “Echo chambers online?: Politically motivated selective exposure among Internet news users.” Journal of Computer-Mediated Communication 14.2 (2009): 265-285.
Bakshy, Eytan, Solomon Messing, and Lada A. Adamic. “Exposure to ideologically diverse news and opinion on Facebook.” Science 348.6239 (2015): 1130-1132.

Summary

The central theme of both papers is echo chambers on the Internet, and specifically on social media websites such as Facebook. The fundamental premise the authors build on is that likes attract and opposites repel. Garrett identifies selective exposure as the main driver behind people’s news choices on the Internet. Algorithms in social media websites and recommender systems suggest similar content to users, driving them toward an extreme: opinions are reinforced, and differing opinions are neither read nor recommended.

However, the papers differ in their methods. Garrett conducts a small, specific behavior-tracking study (N = 727), with users recruited from readers of two news websites – AlterNet (left-leaning) and WorldNetDaily (right-leaning). Since readers already visit these websites, ground truths about their political affiliation are assumed; once these users sign up for the study, their news-reading behavior is tracked. Bakshy et al. perform a similar analysis on a significantly larger scale: 10.1 million active Facebook users who self-report their political affiliations. Their evaluation involves sophisticated data collection spanning ~3.8 billion potential exposures, 903 million exposures, and 59 million clicks. The paper by Bakshy et al. is very dense in content, and I referred to [1, 2] for more explanation.

Both papers conclude by confirming our suspicions about users’ content preferences – they spend more time reading opinion-reinforcing articles than opinion-challenging ones.

Reflections

Kelly Garrett’s paper, though published in 2009, uses data collected in 2005. This was a time before global social media websites like Facebook and Twitter were prevalent. At the time, the author chose the best means to generate ground truth by looking at left-leaning and right-leaning websites. However, this mechanism of classification feels naïve. It is possible that certain users merely stumbled upon the website or participated in the survey for the reward. Equally importantly, the sample is not truly reflective of the American population, as a vast majority may look for news from a more unbiased source.

One of the undercurrents of the “Echo chambers online?” paper is the effect of the Internet in making these biases more profound. However, the study does not attempt to measure users’ preferences before the advent of the Internet. Would the same citizenry buy partisan newspapers, or is this behavior reflective only of news on the Internet?

Bakshy et al.’s paper is considerably more recent (2015). While it evaluates many of the same questions as Garrett’s paper, the methodology and mechanism are fundamentally different, as is the time period; therefore, comparing the two papers feels a little unfair. Facebook and Twitter are social platforms, and in that sense very different from news websites. These are platforms where you do not fully choose the content you see: the content served to you is an amalgamation of what your friends share, filtered by a ranking algorithm. However, a distinction must be made between a website like Facebook and one like Twitter. The authors themselves highlight an important point:

Facebook ties primarily reflect many different offline social contexts: school, family, social activities, and work, which have been found to be fertile ground for fostering cross-cutting social ties.

Therefore, it is substantially more possible to interact with an opinion-challenging article. However, interaction is sometimes poorly defined, because there is no real way of knowing whether a user merely looked at an article’s summary without clicking on it. Hence, tracking exposure can be tricky, and it remains an avenue for further research.

Questions

  1. Almost 10 years later, especially after the wave of nationalism across the world, is there more polarization of opinion on the Internet?
  2. Is polarization an Internet phenomenon or are we measuring it just because most content is now served digitally? Was this true back in 2005?
  3. Can and should recommendation algorithms have individual settings to allow users to modify their feed and allow more diverse content?

References

[1] https://solomonmessing.wordpress.com/2015/05/24/exposure-to-ideologically-diverse-news-and-opinion-future-research/

[2] https://dataverse.harvard.edu/dataset.xhtml?persistentId=doi:10.7910/DVN/LDJ7MS


Reflection #5 – [02/06] – [John Wenskovitch]

Both of these papers examine the trend towards “selective exposure” and away from diverse ideological exposure when viewing news electronically.  At a high level, both papers are touching on the same big idea – that users seem to be creating insular “echo chambers” of polarized news sources based on their ideals, ignoring the viewpoints of the opposing ideology either by their own conscious choice or algorithmically by their past behavior.  The Garrett paper looks at general web browsing for news sources and focuses on the area of opinion reinforcement.  The study details a web-administered behavioral study in which participants were shown a list of articles (and their summaries) and were given the choice of which ones they wanted to view.  The study findings supported the author’s hypotheses, including that users prefer to view news articles that are more opinion-reinforcing and that users will spend more time viewing those opinion-reinforcing articles.  The Bakshy et al. study was Facebook-centered, examining how users interact with shared news articles on that platform.  Among their findings were that exposure to ideologically cross-cutting content depended on both the spectrum of friends’ ideologies and how often those friends shared, but that there was some evidence of ideological isolation in both liberal and conservative groups.
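The dependence of cross-cutting exposure on friends’ ideologies and sharing rates can be illustrated with a toy calculation. This is only a sketch of the general idea, not Bakshy et al.’s actual pipeline; the alignment scale (−1 liberal to +1 conservative) and the per-friend sharing rates are hypothetical inputs invented for illustration.

```python
# Toy sketch: estimate the fraction of a user's exposures that are
# "cross-cutting", i.e. come from friends whose ideological alignment
# has the opposite sign to the user's own.

def cross_cutting_fraction(user_alignment, friends):
    """friends: list of (alignment, shares_per_week) tuples."""
    total = cross = 0.0
    for alignment, shares in friends:
        total += shares
        # An exposure counts as cross-cutting when the sharer's alignment
        # opposes the viewer's alignment.
        if alignment * user_alignment < 0:
            cross += shares
    return cross / total if total else 0.0

# A conservative user (+0.8) with mostly like-minded friends:
friends = [(+0.9, 10), (+0.5, 5), (-0.7, 3), (-0.2, 2)]
print(cross_cutting_fraction(0.8, friends))  # 5/20 = 0.25
```

Even this crude version shows both effects the paper reports: adding more opposing-side friends, or having those friends share more often, raises the fraction, while a homogeneous friend network drives it toward zero.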

Both of these studies had notable limitations that were discussed by the authors, but I felt that each was addressed insufficiently.  The Garrett study made use of both a liberal and a conservative online news outlet to obtain participants, which obviously will not ensure that the sample is representative of the population.  Garrett justifies this by supposing that if selective reinforcement is common in these groups, then it is likely the same among mainstream news readers; however, (1) no attempt is made to justify that statement (the brief mention in the Limitations section even contradicts this assertion), and (2) my intuition is that the opposite is true: that if selective reinforcement is common among centrists, then it almost certainly will be true at the ideological extremes.  In my opinion, the results from this study do not generalize, and this is a killer limitation of the paper.

Bakshy’s study has a similar limitation that the authors point out: that they are limited to recording engagement based on clicks to interact with articles.  As a result, individuals might spend some time reading the displayed summaries of some articles but never click to open the source, and such interactions are not logged.  To use the authors’ phrasing, “our distinction between exposure and consumption is imperfect.”  This surprised me – there was no way to record the amount of time that a summary was displayed in the browser, to measure the amount of time a viewer may have thought about that summary and decided whether or not to engage?  I know in my experience, my newsfeed is so full and my time is so limited that I purposefully limit the number of articles that I open, though I often pause to read summaries in making that decision.  I do occasionally read the summaries of ideologically-opposing articles, but I rarely if ever engage by clicking to read the full article.  Tracking exposures based on all forms of interaction would be an interesting follow-up study.

Despite the limitations, I thought that both studies were well-performed and well-reported with the data that the authors had gathered.  Garrett’s hypotheses were clearly stated, and the results were presented clearly to back up those hypotheses.  I wish the Bakshy paper had been longer so that more of their results could be presented and discussed, especially with such a large set of users and exposures under study.


Reflection #5 – [02/06] – [Meghendra Singh]

  1. Garrett, R. Kelly. “Echo chambers online?: Politically motivated selective exposure among Internet news users.” Journal of Computer-Mediated Communication 14.2 (2009): 265-285.
  2. Bakshy, Eytan, Solomon Messing, and Lada A. Adamic. “Exposure to ideologically diverse news and opinion on Facebook.” Science 348.6239 (2015): 1130-1132.

Both papers discuss online “echo chambers” – communities, groups, or forums on the Internet that are devoid of differing viewpoints, i.e., places where individuals are exposed only to information from like-minded people. The second paper also discusses “filter bubbles,” the tendency of content-delivery services and algorithms to deliver or recommend content to users based only on their viewing history. Both issues are important, as they can give rise to a fragmented, opinionated, and polarized citizenry. While Garrett’s paper focused on analyzing behavior-tracking data collected from readers of two partisan online news websites, Bakshy et al. analyzed de-identified social news-sharing data from 10.1 million Facebook users in the U.S.

The results presented in Garrett’s paper suggest that individuals are more likely to read news stories containing “high” opinion-reinforcing information than “high” opinion-challenging information. Additionally, people generally spend more time reading news stories containing “high” opinion-challenging information than those containing “high” opinion-reinforcing information. While reading the paper, I felt it would be interesting to study how reading opinion-reinforcing news affects the reader’s opinion or attitude versus reading news that conflicts with it. While both studies focused on political news, which in my opinion spans a wide range of debatable topics, I feel it would be interesting to redo the study on groups or communities grounded in fanatical, unscientific beliefs – anti-vaccination, religious extremism, and flat Earth, to name a few. We could also repeat this study in other geographies (instead of just the U.S.) and compare the media through which news is delivered. For example, are people more likely to read a news story containing opinion-challenging information if it is presented to them in a physical newspaper rather than on an online news website? This points to a deeper question: is the Internet making us more opinionated, insular, and trapped in our idiosyncratic beliefs and ideologies?

If I have understood it correctly, the participants in Garrett’s study complete a post-reading assessment after reading every news story. Given that participants have only 15 minutes to read the stories, it is unclear whether the time spent finishing the post-assessment questionnaire was included in those 15 minutes. If the post-assessment was indeed included in the 15-minute reading window, I feel it might bias the post-assessment or the choice of the second news story selected. Moreover, it would have been useful to have some statistics about the length of the news stories, say the mean and standard deviation of their word counts. It would also have been useful to know more about the distribution of age and income in the two subject populations (the author reports the average age and some information about income). It may also be interesting to analyze the roles played by gender, age, and income in political opinion as a whole. Overall, I feel the paper presented a very interesting study for its time – a time when users had much more control over what they read.

The Science article by Bakshy et al. presents its quantitative analysis very well and does a good job of explaining the process of media exposure in friendship networks on Facebook. An interesting research question would be to study how likely people are to share a news story that conflicts with their affiliations, ideology, or opinions, compared with one that aligns with their opinions. Another concern is whether the presented results would hold across geographies.
