Reflection #4 – [02/07/2019] – [Phillip Ngo]

Summary

This paper focuses on the hate and violence activity of far-right YouTube channels and their viewers. The authors examined YouTube channels and videos related to Alex Jones and InfoWars, which are categorized as far-right, along with a baseline of the top 10 news channels on YouTube. Their analysis covered the lexicon, topics, and bias of the videos and comments. They concluded that right-wing channels have more niche and targeted content that is both aggressive and violent, specifically toward the Muslim community.

Reflection

I think this paper is one of those candidates for “Everything is Obvious,” where many probably wouldn’t be surprised by the results. But after seeing the data and methodology, it’s clear that any reader could learn something new from the many results and reflections within the paper. The sheer number of statistics gives us a great idea of just how much these channels’ behavior gets exacerbated through their interactions. On the flip side, one thing that bothered me was that the authors seemed to choose data and methodologies that would complement the results they were looking for.

I found their choice of baseline to be a little odd. For the right-wing channels they carefully reviewed every single channel in their category, but they simply chose the top 10 “news” channels for the baseline. The most popular, YouTube Spotlight, really doesn’t have much news (or anything politically related). A quick glance at its recent uploads shows videos like YouTube Rewind, fashion videos, and music videos. Even though the baseline channels aren’t intended to represent “neutral” users, I’m not sure what they are supposed to represent.

Another thing I noted is that this paper is similar to the first one we reflected on, in that a lot of data dumping is happening. The authors seemed to over-explain the details and methods they used, to an extent that went over my head. With that said, the amount of analysis that went into the research is astounding. They were able to pull out and compare the captions and comments to a greater extent than I could have imagined. One example is the breakdown of terms into negative and positive categories, showing the percentage differences in aggression, anger, and disgust between the channel groups.

Regardless of some of the flaws, there definitely could be a ton of future work in this domain:

  • Instead of this baseline, compare distinct categories such as far-left, left, right, and neutrally perceived channels, and see how they differ from each other.
  • The authors also suggested a temporal look at the channels; to add to that, we could examine the extent to which hate and violence propagates or grows as a channel gains viewers and subscribers.
  • Can we also gauge the reactions to these videos and comments? For this it might be useful to look at the number of dislikes or reports on these channels, as well as banned users.
