[1] Garrett, R. Kelly (2009) – “Echo chambers online? Politically motivated selective exposure among Internet news users”
[2] Resnick, Paul (2013) – “Bursting Your (Filter) Bubble: Strategies for Promoting Diverse Exposure” – Proceedings of CSCW ’13 Companion, Feb. 2013.
Summary
The topic for today’s discussion is polarization and selective exposure: people are exposed to only a certain kind of news and are unaware of what is going on beyond that “exposure bubble”. The selective exposure produced by a filter bubble is a result of personalized search, which filters out diverse and conflicting information. These papers survey participants about their political beliefs. Previous research shows that Internet news users mostly read and watch news that conforms with their political beliefs (opinion reinforcement) rather than news that challenges those beliefs. People’s growing use of the Internet to acquire political knowledge has therefore raised concerns about selective exposure. The authors conducted a web-administered behavior-tracking study to assess how one’s political attitudes influence their use of online news. The study tested five hypotheses (H1 to H5) based on existing research on exposure processes. It found that people are more inclined to look at, and spend more time reading, news stories containing opinion-reinforcing information, and that they are less likely to look at or be influenced by opinion-challenging information. The last hypothesis concluded that a person will spend more time examining a story that contradicts or challenges their opinion: the greater the length of exposure, the greater the opportunity for the person to refute that story’s arguments.
Reflection
We were first exposed to the filter-bubble effect in the paper by Hannak et al. [3]: a user sees information that is personalized based on their search history, possibly as a result of algorithmic bias. The echo-chamber scenario arises mainly from web personalization. A user is exposed to beliefs and news relevant to their search history and, de facto, their personal beliefs. It was interesting to learn that, in the political domain, people also spend time reading politically challenging material in order to critique the opposing idea further. Exposure to varied ideas can thus break the filter bubble, but it is not likely to alter a person’s beliefs. The need for algorithmic audits is becoming clear to me.
Another domain where selective exposure could have an even larger negative impact is health. Whether the beliefs concern vaccination or other medical procedures, if people who oppose them are exposed only to data that supports their position, the consequences could be severe. As in the example provided in the text, even if these groups are exposed to theories contrary to their beliefs, they might use that exposure as a form of opinion reinforcement. How can we make sure that, along with breaking the filter bubble, a person also becomes open to new ideas?
[3] Hannak, Aniko et al. (2013) – “Measuring Personalization of Web Search” – Proceedings of the International World Wide Web Conference (WWW ’13), 527–537.