Reflection 10 – [10/02] – [Lindah Kotut]

  • Kate Starbird. “Examining the Alternative Media Ecosystem Through the Production of Alternative Narratives of Mass Shooting Events on Twitter” (2017)
  • Mattia Samory and Tanushree Mitra. “Conspiracies Online: User discussions in a Conspiracy Community Following Dramatic Events” (2018)

Brief: The two papers extend the discussion from filter bubbles and echo chambers, which result from black-box platforms, to crippled epistemology or “intellectual isolation,” which stems from a lack of source diversity driven mostly by user actions.

Lens: The two papers apply different lenses to alternative theories surrounding dramatic events: Starbird takes a journalist/alternative-media lens on Twitter, while Samory and Mitra take a user lens on Reddit. Starbird wanted an overall picture of the alternative media ecosystem (who the big hitters were) and so followed a network approach; Samory focused on who the users are, how their behavior typifies them as conspiracy theorists, and how that behavior develops. Starbird’s work is also (mostly) qualitative, while Samory follows a quantitative analysis.

Countering Conspiracy
There is a thin line between conspiracy/alternative narratives and weaponized disinformation. Starbird touches on the latter in discussing the #PizzaGate narrative and the role Russian disinformation played in spreading it. It is only when the line has been crossed from free speech to endangering lives that the legal machinery comes into play. Samory recommends intervening before that line is breached, a starting point grounded in the quantitative analysis of the types of users in the r/conspiracy subcommunity.

This line between free speech and weaponized misinformation gives us a retrospective lens through which to view both works, but especially the recommendations made in Samory’s paper. We use three examples that preceded and followed both papers:

  1. The fall of Milo Yiannopoulos.
  2. The fall of Alex Jones.
  3. Grassroots campaigns.

The three cases allow us to probe for reasons: Why do platforms choose to spread misinformation? State actors aside, the other reason is monetary (setting aside the argument that it is easy to churn out a massive amount of content if journalistic integrity is not a concern, as Starbird notes). There is money in conflict. Conspiracy is a motherlode of conflict.

Removing the monetary incentive seems to be the best way to counter the spread and flourishing of these platforms. Doing this uniformly and effectively requires the cooperation of the platforms, and is subject, ironically, to the rise of the shadow-banning conspiracy (Twitter/Facebook being blamed for secretly silencing mostly conservative voices).

Why seek veteranship?
I would propose another vector of measurement to best counter theories from veterans: Why do veterans stay and continually post? There is no (overt) monetary compensation to be gained. And even if they are a front for an alternative news source, that does not square with the long game of continually contributing “quality” content to the community, which runs counter to the type and volume of “news” from the alternative sources. It is easy to profile joiners and converts, as they are mostly swayed by dramatic events, but not veterans. Do they chase the sense of community with other like-minded people, or the leadership, the clout that their longevity as posters brings to bear on these decisions? These innate yet almost unmeasurable goals would make it difficult to ‘dethrone’ (for lack of a better term) these veterans from being arbiters of conspiracy theories. The (lack of) turnover among the community’s moderators would also help answer these questions.


Reflection #10 – [10/02] – Subhash Holla H S

“When you are young, you look at television and think, there’s a conspiracy. The networks have conspired to dumb us down. But when you get a little older, you realize that’s not true. The networks are in business to give people exactly what they want.” – Steve Jobs

The paper is a good precursor to the research project that Shruthi and I are interested in. I would like to analyze the paper in terms of a few key points, capturing what the paper mentions and sharing my own insights and possible explanations for them.

Conspiracy theory:

The paper alludes to the idea that people online, and websites in general, cater to “conspiracy theories” because that is what has been seen to draw people’s attention. But what are conspiracy theories? The paper categorizes them under “alternative narratives” without giving a formal definition of what counts as a “conspiracy theory.” I will define it as “any propagation of misinformation,” which I believe is broadly the meaning the paper uses as well. A couple of interesting points the paper makes that I feel are worth addressing:

  • “once someone believes in a conspiracy theory of an event, it is extremely hard to dissuade them from this belief”. Here I would like to further substantiate that some opinions and beliefs are ingrained in a person, and unless they are given a logical explanation over a long period of time, reinforcing the notion that they might be wrong, it will be difficult to get them to change their stance. This effort is a necessary one, and I feel painting a picture of where the information is coming from will help negatively reinforce the idea that they are right. If we do not make this effort, then “belief in one conspiracy theory correlates with an increased likelihood that an individual will believe in another” will turn out to be true as well.
  • The definition of conspiracy theorists as part of an ‘alternative to “corporate-controlled” media’ is one I do not agree with. This raises a philosophical debate as to where we draw the line. Should we draw a line at all, or should we look for methods that do not try to draw one?

Bias:

The paper’s statement that the “first author is a left-leaning individual who receives her news primarily through mainstream sources and who considers the alternative narratives regarding these mass shooting events to be false” was, to me, a revelation in the field of human behavioral modeling. Being a part of the Human Factors community and having interacted with many Human Factors professionals, this is the first time I have seen an author explicitly mention an inherent bias in an effort to eliminate it. Acknowledgment is the first step to elimination. I think the elimination of bias should follow a procedure similar to the time-tested 12-step model we follow in addiction recovery. That could be an interesting study, as such a model could shed some light on my hypothesis that “humans are addicted to being biased.”

Another point is the use of a confusion matrix based on signal detection theory. We could use this to build a conservative or liberal model of a human, and then use this generalized model to help design tools to foil the propagation of misinformation and “alternative narratives”.
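To make the signal-detection idea concrete, here is a minimal sketch (my own illustration with made-up counts; neither paper prescribes an implementation) of turning a reader's confusion matrix into the standard sensitivity and bias measures:

```python
from statistics import NormalDist

# Hypothetical confusion matrix for a reader judging news items:
# rows are the true label, columns are the reader's judgment.
hits = 80             # real news correctly judged real
misses = 20           # real news judged fake
false_alarms = 30     # fake news judged real
correct_rejects = 70  # fake news correctly judged fake

hit_rate = hits / (hits + misses)                         # 0.8
fa_rate = false_alarms / (false_alarms + correct_rejects) # 0.3

# d' (sensitivity) and c (response bias) from signal detection theory.
z = NormalDist().inv_cdf
d_prime = z(hit_rate) - z(fa_rate)
criterion = -0.5 * (z(hit_rate) + z(fa_rate))

print(f"d' = {d_prime:.2f}, c = {criterion:.2f}")
```

A higher d' would indicate a reader who discriminates real from fake news well; the criterion captures whether they lean conservative or liberal in calling items "real".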

General Reflection:

In general, I found a couple more observations that resonated with the previous readings of this semester.

The overall focus on misinformation propagation, coupled with the video lecture in which the author presents Deep Fakes as an example of misinformation propagation, sent me back to a question I have been asking myself of late. All of the research we have analyzed is on text data. What if the same were video data? Especially in this case, since we get some, if not all, of our information from YouTube and similar video platforms. Would this research translate directly to that case? Is there existing and/or ongoing research on the same? Is there a need for research on it?

Theme convergence was another concept that really interested me, as I would like to understand how diverse domains converge on common themes. This would help build better group behavioral models and avoid Simpson’s paradox, which researchers often fall into when dealing with human data.

PAPER:

K. Starbird, “Examining the Alternative Media Ecosystem through the Production of Alternative Narratives of Mass Shooting Events on Twitter,” in Proceedings of ICWSM, pp. 230–239, 2017.


Reflection 10 – [10/02] – [Vibhav Nanda]

Readings:

[1] Examining the Alternative Media Ecosystem through the Production of Alternative Narratives of Mass Shooting Events on Twitter

Summary:

In this paper, the author talks about the ecosystem around fake news/alternative news/conspiracy theories. To study such an ecosystem, the author took an interpretivist approach, “blending qualitative, quantitative and visual methods to identify themes and patterns in the data.” The data was collected from the Twitter Streaming API, tracking words that could indicate a shooting (such as gunmen, shooter, and shooting) over a 10-month period, resulting in 58M tweets. The extremely high tweet count for a single topic was the result of 3 high-profile mass shootings: Orlando (FL), Munich (Germany), and Burlington (WA). To extract tweets related to alternative narratives, the author used keywords like false flag, crisis actor, and hoax, resulting in 99,474 tweets. After collecting the tweets, the author carried out a great deal of classification of the accounts and the domains that the links point to, and created a network graph to fully understand the alternative news ecosystem. Interestingly, the author found thematic similarities between conspiracy theories and alternative narratives of real-life events.
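The two-stage collection described above can be sketched roughly as follows. The keyword lists come from the paper, but the helper functions are hypothetical illustrations, not Starbird's actual pipeline:

```python
# Sketch of the two-stage keyword filtering summarized above; only the
# keyword lists come from the paper, the helpers are my own illustration.
TRACKING_TERMS = ["gunman", "gunmen", "shooter", "shooting"]
ALT_NARRATIVE_TERMS = ["false flag", "crisis actor", "hoax"]

def matches_any(text: str, terms: list[str]) -> bool:
    text = text.lower()
    return any(term in text for term in terms)

def filter_stream(tweets: list[str]) -> list[str]:
    """Stage 1: keep shooting-related tweets; stage 2: keep the
    subset that references an alternative narrative."""
    shooting = [t for t in tweets if matches_any(t, TRACKING_TERMS)]
    return [t for t in shooting if matches_any(t, ALT_NARRATIVE_TERMS)]

sample = [
    "Active shooter reported downtown",
    "The Orlando shooting was a false flag, wake up",
    "Great weather today",
]
print(filter_stream(sample))  # only the 'false flag' tweet survives
```

In the paper the first stage yielded 58M tweets and the second stage narrowed them to 99,474.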

Reflection/Questions:

This was an extremely interesting read for me, as conspiracy theories are my guilty pleasure, though only for entertainment. 58M tweets were collected relating to shootings, yet only 99,474 were identified as being related to alternative news. Seeing that only an extremely small percentage (around 0.17%) were related to conspiracy theories, I would say this is not an epidemic YET. While reading this paper, I started thinking about possible solutions or tools to raise awareness amongst readers without banning or blacklisting the websites, pages, or users who indulge in such activities. I came up with the following system.

New Design:

Background: I would say there is a difference between information and news. Information includes both opinions and facts and may or may not be verified; news (in its typical sense) is only facts and is verified from multiple sources. Stemming from this difference, citizens should be allowed to freely disseminate all the information they want, but not news. Only authenticated citizens should be allowed to disseminate news; we can call them e-journalists. The same goes for websites and pages on Facebook (and other social media websites). The system I am going to outline focuses only on websites.

Assumption: The user is male, and the platform under discussion is Twitter (the system can be scaled to other platforms as well).

Explanation of the system: The system has multiple facets to it.

A) Each time a user cites an authenticated website as a news source in his tweet, he gets some reward points (for being a responsible citizen). Each time a user cites an unauthenticated website as a news source, he gets penalized. If the user ends up with 0 points, he will not be allowed to cite any more unauthenticated websites unless he first gains some points by citing an authenticated source. Let’s call this point system “Karma points”.

B) When the user posts a link to an unauthenticated website as a news source, he will get a warning pop-up window when he presses the tweet button. The pop-up will let him know that the news source is not authenticated, could include some more discouraging language, and will have a confirm button that allows him to cite the website anyway. When the tweet is finally posted, it will carry a warning next to it, letting other users know that this tweet cites an unauthenticated website as its news source. This will discourage users from citing such websites in the first place.

C) When a different user (the reader) clicks on the unauthenticated website, he will also get a pop-up warning saying that he is “about to enter a website that is not identified as an authentic news source.” He would have to click the confirm button to move forward. The reader’s Karma points will remain unaffected.
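Facets A through C could be prototyped along these lines; the class name, starting balance, point values, and messages are all hypothetical choices of mine, sketched only to show the mechanics:

```python
# Sketch of the "Karma points" system from facets A-C above. The class
# name, point values, and warning text are hypothetical choices.
class KarmaTracker:
    def __init__(self, points: int = 10):
        self.points = points

    def cite(self, authenticated: bool) -> str:
        """Returns the action the platform takes for this citation."""
        if authenticated:
            self.points += 1          # facet A: reward responsible citing
            return "posted"
        if self.points <= 0:          # facet A: no credit left to spend
            return "blocked: cite an authenticated source first"
        self.points -= 1              # facet A: penalty
        # facet B: the tweet still goes out, but carries a warning label
        return "posted with warning: unauthenticated news source"

user = KarmaTracker(points=1)
print(user.cite(authenticated=False))  # warning label, points drop to 0
print(user.cite(authenticated=False))  # blocked until an authenticated cite
print(user.cite(authenticated=True))   # posted, points back to 1
```

Facet C (the reader-side pop-up) is deliberately left out of the tracker, since per the proposal it never touches the reader's Karma points.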

Effects of the design: I believe such a design will caution people when entering an unauthenticated website cited as a news source. It will also dissuade people from sharing websites that tend to carry fake news or conspiracy theories. The dissuasion comes first as a caution (before they post) and then as shame (when their post is marked with a warning label and their Karma points are reduced).

Read More

Reflection 10 – [10/02] – [Prerna Juneja]

Examining the Alternative Media Ecosystem through the Production of Alternative Narratives of Mass Shooting Events on Twitter

Reflection:

In this paper, the author studies the “fake news” phenomenon by generating and studying domain network graphs and qualitatively analysing tweets collected over a ten-month period. She explains the political leanings of the alternate news sources and describes how these websites propagate and shape alternate narratives.

Examples

“Global warming is a hoax created by the UN and politically funded scientists/environmentalists, aided by Al Gore and many others, to put cap and trade in place for monetary gain.”

“Osama bin Laden is not dead / has been dead for long / is an invention of the American government.”

“That ample evidence of alien life and civilization exists in our solar system, but is covered up by NASA.”

These are some of the popular conspiracy theories. Several others can be found on websites dedicated to listing and categorizing such theories.

What is the need to study conspiracy theories?

Research has shown that if you start believing in one conspiracy theory, there is a high chance that you might start believing in another. “Individuals drawn to these sites out of a concern with the safety of vaccines, for example, may come out with a belief in a Clinton-backed paedophilia ring”, stated Kate Starbird in one of her published articles.

Why do people believe in conspiracy theories and false narratives? How to change their mind?

The author of an article replicated part of an existing study. He conducted a Twitter poll asking users whether the sequence 0 0 1 1 0 0 1 0 0 1 0 0 1 1 has any pattern. 56% agreed that there is indeed a pattern, even though the sequence was generated by randomly flipping a coin.
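A quick simulation (my own illustration, not from the cited article) shows why such polls mislead: pattern-looking runs of three or more identical outcomes appear in the vast majority of purely random sequences of this length:

```python
import random

def has_run(seq, length=3):
    """True if the sequence contains a run of `length` identical flips."""
    run = 1
    for a, b in zip(seq, seq[1:]):
        run = run + 1 if a == b else 1
        if run >= length:
            return True
    return False

random.seed(0)
trials = 10_000
hits = sum(has_run([random.randint(0, 1) for _ in range(14)])
           for _ in range(trials))
print(f"{hits / trials:.0%} of random 14-flip sequences contain a run of 3+")
```

Roughly nine in ten random 14-flip sequences contain such a run, so seeing "a pattern" in the polled sequence is the expected outcome, not evidence of structure.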

The author cites the paper “Connecting the dots: how personal need for structure produces false consumer pattern perceptions” [I couldn’t find the pdf version, I just read the abstract] and states that

“One of the reasons why conspiracy theories spring up with such regularity is due to our desire to impose structure on the world and incredible ability to recognise patterns” and

“facts and rational arguments really aren’t very good at altering people’s beliefs”

The same article discusses several approaches and cites several studies on how to convey authentic information and make people change their minds:

  • Use stories: People engage with narratives much more strongly than with argumentative or descriptive dialogues.
  • Don’t mention the myths while making your point since it has been seen that myths are better remembered than facts.
  • While debunking fake theories, offer explanations that resonate with people’s pre-existing beliefs. For example, conservative climate-change deniers are much more likely to shift their views if they are also presented with pro-environment business opportunities.

Use of Bots

One thing I noticed in the paper is the use of bots to spread conspiracy theories and misinformation. It seems that during major events, bot activity increases manifold. I found two studies analysing bot activity (not limited to the spread of conspiracy theories): “News Bots: Automating news and information dissemination on Twitter” [1] and “Broker Bots: Analysing automated activity during high impact events on Twitter” [2].
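As a toy illustration of how such bot analyses often begin (this is my own naive heuristic, not the method of either cited study), one can flag accounts whose posting rate during an event window is implausibly high for a human:

```python
from collections import Counter

# Naive bot heuristic (my own illustration): flag accounts whose tweet
# volume inside an event window exceeds a per-hour threshold.
def flag_suspected_bots(authors, window_hours, max_per_hour=30):
    """authors: one entry per tweet observed in the event window."""
    limit = max_per_hour * window_hours
    counts = Counter(authors)
    return {acct for acct, n in counts.items() if n > limit}

# 2-hour window: @newsbot posts 100 times, @alice posts 5 times.
stream = ["@newsbot"] * 100 + ["@alice"] * 5
print(flag_suspected_bots(stream, window_hours=2))  # {'@newsbot'}
```

Real bot-detection systems combine many more signals (account age, retweet ratios, content similarity), but rate thresholds of this kind are a common first cut.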

More diverse events

The data in the study was limited to shooting events. The analysis could be extended to other high-impact events such as natural disasters, elections, and policy changes, to find out the similarities, if any, in the sources spreading the misinformation and in the misinformation itself.

Influence of multiple online communities

Certain subreddits (the_donald) and 4chan (/pol/) communities have been accused of generating and disseminating alternative narratives and conspiracy theories. What is the role of these communities in spreading the rumours? Who participates in these discussions? And how are users influenced by these communities?

Identify conspiracy theories and fake news

How do rumors originate? How do they propagate in an ecosystem? What is the lifespan of these rumors?

I feel an important step in identifying conspiracy theories is to study how the language and structure of articles from alternate sources differ from those of mainstream ones, and not just the articles but also their carriers, i.e. the posts/tweets sharing them. How is the story in these articles woven and supported? We saw an example in the paper where the Sandy Hook shooting (2012) was referenced during the Orlando shooting (2016) as evidence to support the alternate narrative. What other sources are used to substantiate these claims? The authors of “How People Weave Online Information Into Pseudoknowledge” find that people draw from a wealth of information sources to substantiate and enrich their false narratives, including mainstream media, scholarly work, popular fiction, and other false narratives.

Articles/videos related to conspiracy theories among top search results

A simple search for “vaccination” on YouTube gives at least 3–4 anti-vaccination videos in the top 30 results, where people share theories about why vaccines are not safe (link1, link2). More conspiratorial content in the top results will expose more people to fake stories, which might end up influencing them.


Reflection 10 – [10/02] – [Shruti Phadke]

Paper 1: Starbird, Kate. “Examining the Alternative Media Ecosystem Through the Production of Alternative Narratives of Mass Shooting Events on Twitter.” ICWSM. 2017.


Starbird’s paper qualitatively analyzes the sources of news related to shooting events on Twitter, based on interpretive graph analysis. She analyzes how alternative and mainstream sources are referenced in different contexts when it comes to conspiracy theories, that is, either in support of or to dispute the claims. The paper also dwells on how such citing of alternative news sources can fuel politically motivated conspiratorial thinking.

One common theme present throughout the paper is how all of the events are blamed on one central “untouchable” entity such as the U.S. government or a secret organization. This comes from a very common conspiratorial trait according to which “everything needs to have a reason”. Conspiracists are found to relive the event in its aftermath by “rationalizing” it [1]. Further, such theories go on to give increased importance to certain agents such as the FBI, the government, the Illuminati, etc. The paper mentions that 44 out of 80 sources were promoting a political agenda. It would be interesting to know which agents such sources frequently target, and how they tie these agents to multiple events.

The paper makes another excellent point: “exposure to online media correlates with distrust of mainstream media”. Considering that the mainstream media corrects conspiratorial claims or presents the neutral narrative, it would be interesting to do a contrast study mapping the network of users “bashing” mainstream media. One important thing to note here is that with text analysis methods alone, it is difficult to understand the context in which a source of information is cited. This reflects in the ways mentioned in the paper by which alternative media promote alternative narratives: 1. they cite alternative sources to support the alternate narrative, or 2. they cite mainstream sources in a confrontational way. This is where quantitative approaches are tricky to use, because the mere presence of a link in a post doesn’t tell much about the argument being made about it. Similarly, hand-coding techniques are limiting, because analyzing the context, source, and narrative takes a long time and results in a high-quality but smaller dataset. One possible way to automate this process is to perform “Entity Sentiment Analysis”, which combines entity analysis and sentiment analysis and attempts to determine the sentiment (positive or negative) expressed about entities within the text. Treating the cited sources as “proxy” entities, it may be possible to find out whether they are discussed in a positive or negative light.
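As a toy, self-contained approximation of entity sentiment analysis (real systems use trained models; the lexicon and entity list below are invented for illustration), one can pair each matched entity with a crude lexicon score for the sentence it appears in:

```python
# Toy approximation of entity sentiment analysis. The sentiment lexicon
# and entity list are made up for illustration, not from the paper.
POSITIVE = {"trustworthy", "accurate", "honest", "reliable"}
NEGATIVE = {"lying", "fake", "corrupt", "propaganda"}
ENTITIES = {"mainstream media", "fbi", "infowars"}

def entity_sentiment(sentences):
    """Sum a per-sentence lexicon score into each matched entity."""
    scores = {}
    for sent in sentences:
        low = sent.lower()
        words = set(low.replace(".", " ").split())
        polarity = len(words & POSITIVE) - len(words & NEGATIVE)
        for ent in ENTITIES:
            if ent in low:
                scores[ent] = scores.get(ent, 0) + polarity
    return scores

posts = [
    "The mainstream media is lying about this shooting.",
    "The FBI report was accurate and reliable.",
]
print(entity_sentiment(posts))  # {'mainstream media': -1, 'fbi': 2}
```

Replacing the toy lexicon with a trained sentiment model, and the substring match with proper named-entity recognition, would give the "proxy entity" analysis suggested above.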

The paper also mentions that believing in one conspiracy theory makes a person more likely to believe another. This, along with the cluster of domains supporting alternative narratives in Figure 2, can form a basis for quantitatively analyzing how such communities unify [2].

Lastly, as a further research point, it would also be interesting to analyze when a particular alternate narrative becomes popular. Why do some theories take hold while many more do not? Is it because of informational pressure or because of the particular event? One starting point for this kind of analysis is [3], which finds that when an external event threatens to influence users directly, they explore content outside their filter bubble. This would require a retrospective analysis of posting behavior before and after a specific event, considering users who are in geographical, racial, or other ideological proximity to the group affected by that event.
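The retrospective before/after comparison suggested here could start as simply as splitting each user's post timeline on the event date. A minimal sketch, with invented dates and a 30-day window as arbitrary assumptions:

```python
from datetime import date

# Minimal sketch of the before/after posting-rate comparison proposed
# above; the data, event date, and window size are invented.
def posting_rate_change(post_dates, event, window_days=30):
    """Compare posts/day in equal windows before and after `event`."""
    before = sum(1 for d in post_dates if 0 < (event - d).days <= window_days)
    after = sum(1 for d in post_dates if 0 <= (d - event).days <= window_days)
    return before / window_days, after / window_days

event = date(2017, 10, 1)
posts = [date(2017, 9, 20)] * 3 + [date(2017, 10, 5)] * 9
print(posting_rate_change(posts, event))  # (0.1, 0.3)
```

A jump in the after-event rate for users close to the affected group, as in [3], would be the signal worth investigating further.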

[1] Sunstein, C. R., & Vermeule, A. (2009). Conspiracy Theories: Causes and Cures. Journal of Political Philosophy, 17(2), 202–227. https://doi.org/10.1111/j.1467-9760.2008.00325.x

[2] Bessi, A., Coletto, M., Davidescu, G. A., Scala, A., Caldarelli, G., & Quattrociocchi, W. (2015). Science vs Conspiracy: Collective Narratives in the Age of Misinformation. PLOS ONE, 10(2), e0118093. https://doi.org/10.1371/journal.pone.0118093

[3] Koutra, D., Bennett, P., & Horvitz, E. (2015). Events and Controversies: Influences of a Shocking News Event on Information Seeking. TAIA Workshop in SIGIR, 0–3. https://doi.org/10.1145/2736277.2741099
