Reading Reflection #4

Analyzing Right-wing YouTube Channels: Hate, Violence and Discrimination

In this paper, YouTube videos and comments were analyzed using a multi-layered approach to observe trends related to hate, violence, and discrimination. YouTube, a video-sharing website, has raised concerns about its potential to be used as a platform for extremist actors to spread partisan and divisive content, some of which is untrue. The researchers explored whether popular right-wing YouTube channels exhibited different patterns of hateful, violent, or discriminatory content compared to baseline channels. They collected 3,731 right-wing videos with 5,072,728 corresponding comments, as well as 3,942 baseline videos with 12,519,590 corresponding comments. The following research questions were asked:

“Is the presence of hateful vocabulary, violent content and discriminatory biases more, less or equally accentuated in right-wing channels?”

“Are, in general, commentators more, less or equally exacerbated than video hosts in an effort to express hate and discrimination?”

A unique aspect of this paper was its three-layered approach. Lexical analysis was used to compare the semantic fields of the words used by each group. Topic analysis was used to identify the prevalent topics in each group. Implicit bias analysis was used to measure implicit biases in the text. The researchers found that the right-wing channels included a higher percentage of negative semantic fields such as “aggression,” “kill,” “rage,” and “violence,” while the baseline channels included a higher percentage of fields like “joy” and “optimism.” Strong evidence of hate was not found in either the right-wing or the baseline videos. Both categories were also found to exhibit a discriminatory bias against Muslims. Comments were found to have more words related to aggression, rage, and violence than the videos themselves. Seventy-five percent of right-wing videos were shown to have more Muslim bias in the captions than in the comments.
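To make the lexical layer concrete, below is a minimal sketch using the Empath library, which scores text against semantic fields like the ones reported in the paper. This is my own illustration, not the authors' code: the sample caption and the chosen categories are hypothetical.

```python
# Minimal sketch of a lexical-analysis layer (pip install empath-client).
# The sample text and category list below are illustrative only.
from empath import Empath

lexicon = Empath()

# Hypothetical caption text standing in for real video data.
caption = "The host rants with rage and calls for aggression against his enemies."

# Score the text against a few semantic fields. normalize=True divides
# each category count by the token count, so scores stay comparable
# across captions and comment threads of different lengths.
scores = lexicon.analyze(
    caption,
    categories=["aggression", "violence", "rage", "joy", "optimism"],
    normalize=True,
)
print(scores)  # e.g. {'aggression': 0.08, 'rage': 0.08, 'joy': 0.0, ...}
```

Comparing the distributions of such scores between the right-wing and baseline groups is, as I understand it, the essence of this layer.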

I liked the paper, although few of the results surprised me. I struggled to understand what the researchers intended the impact to be. In the conclusion they mentioned that these findings contribute to a better understanding of the behavior of general and right-wing YouTube users. They mentioned in the introduction that there are concerns about YouTube being used as an easy platform to spread hateful, violent, and discriminatory content, but did not elaborate in the conclusion on how their work addresses this concern. I think that knowing which trends are present allows for more informed content sharing and lets others take steps to avoid encouraging hateful or harmful content.

I was surprised by the method of data collection. The researchers used InfoWars as a seed channel and selected other channels favored by its founder as the remaining right-wing channels. I thought this process could have been done more methodically. In addition, they did not specify why they chose k=300 topics for the topic analysis. They may have gotten different results for different values of k, and did not explain why they selected this value as the best option.
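To illustrate why this choice matters, here is a minimal sketch, assuming a standard scikit-learn LDA implementation rather than the authors' actual pipeline, of where k enters a topic analysis and how one might compare values of it. The toy documents are hypothetical.

```python
# Sketch of LDA topic analysis with varying k (not the authors' code).
# The toy documents below stand in for video captions and comments.
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

docs = [
    "government media politics news report",
    "music concert song album band tour",
    "election vote campaign debate policy",
]

vectorizer = CountVectorizer(stop_words="english")
X = vectorizer.fit_transform(docs)

# The paper fixes k=300; sweeping k and comparing a fit metric is one
# common way to justify the choice. Here perplexity is computed on the
# toy corpus itself; in practice it should be held-out data.
for k in (2, 3, 4):
    lda = LatentDirichletAllocation(n_components=k, random_state=0)
    lda.fit(X)
    print(k, lda.perplexity(X))
```

A sweep like this, ideally paired with a topic-coherence metric, would have made the k=300 choice easier to evaluate.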

I like the three-layered approach and think it could be useful for other studies that involve text. Future work could include applying these techniques to Twitter or Reddit data, or doing a similar study involving other topics.
