Reading Reflection #4 – 2/07/2019 – Sourav Panth

Analyzing Right-wing YouTube Channels: Hate, Violence and Discrimination

Summary:

The basis of this paper was to examine issues related to hate, violence, and discriminatory bias in a data set containing more than 7,000 videos and 17 million comments. The authors compared right-wing channels to a baseline set using a three-layered approach that analyzed the lexicon, topics, and implicit biases present in the text to find differences between users' comments and video content. Their results show that right-wing channels tend to contain a higher degree of words from “negative” semantic fields, raise more topics related to war and terrorism, and demonstrate more discriminatory bias against Muslims in videos and towards LGBT people in comments.

Reflection:

Something that I found very interesting was that the paper talks about YouTube’s recommendation algorithm not being neutral during the 2016 presidential election in the United States. This is actually very surprising to me because I thought YouTube would be very neutral since its content is mostly user-submitted videos. I figured the recommendation algorithm would rely entirely on the videos you had watched in the past, without YouTube having a say in what shows up in the sidebar. Something interesting to look at would be the peak viewership of videos recommended by YouTube versus similar videos that were not fed through the algorithm.

One possible source of inaccuracy is that they only used 15 categories to associate comments with negative feelings and 5 categories to associate comments with positive feelings. While I think this is probably fine to start, it seems like the results would be skewed towards the negative side with the current category set. Something else that concerns me is that comments may be directed at the YouTube video itself rather than spewing hate in general. The authors also note that right-wing channels raise more topics related to war and terrorism, which isn’t necessarily a bad thing; discussions about war and terrorism don’t automatically mean hate or discrimination, and with a smaller list of categories some comments may be misidentified. In the future it may help to add more words for association and to even out the number of categories on both the negative and positive sides.
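To make this concern concrete, here is a minimal sketch in Python of lexicon-style category scoring. The category names and word lists are hypothetical placeholders, not the categories actually used in the paper; the point is only to show why aggregating hits over an uneven number of negative and positive categories can tilt the totals.

```python
# Hypothetical category word lists (NOT the paper's actual lexicon).
NEGATIVE_CATEGORIES = {            # the paper uses 15 negative categories
    "war": {"war", "battle", "invasion"},
    "terrorism": {"terror", "bomb", "attack"},
    "hate": {"hate", "despise"},
}
POSITIVE_CATEGORIES = {            # versus only 5 positive categories
    "joy": {"happy", "joy", "glad"},
}

def category_scores(comment, categories):
    """Fraction of the comment's words that fall into each category."""
    words = comment.lower().split()
    return {name: sum(w in word_set for w in words) / max(len(words), 1)
            for name, word_set in categories.items()}

comment = "the attack was awful but I am glad everyone is safe"
neg = category_scores(comment, NEGATIVE_CATEGORIES)
pos = category_scores(comment, POSITIVE_CATEGORIES)

# Two ways to aggregate: a raw sum gives the side with more categories more
# chances to register hits, while a per-category average (or evening out the
# category counts, as suggested above) reduces that imbalance.
print("summed:  ", sum(neg.values()), "vs", sum(pos.values()))
print("averaged:", sum(neg.values()) / len(neg), "vs", sum(pos.values()) / len(pos))
```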

Future Work:

There are a couple of ways you could go about improving upon the research already done. First, I think it would be interesting to see a similar analysis done on left-wing YouTube channels, comparing the hateful topics of the left wing to those of the right wing. Second, I would check the number of right-wing videos recommended by YouTube against the number of left-wing and neutral videos recommended. This would help show YouTube’s bias, and by also looking at views you could see what kind of influence YouTube has on its audience. Finally, I would increase the number of categories used to find comments associated with negative or positive reactions. In the paper they use 15 categories related to negative feelings and 5 categories related to positive feelings, and looking through the words they used, the associations seemed very narrow. I think using a larger number of categories to capture these feelings could result in more accurate sentiment analysis. Another thing to consider is whether the comments that exhibit anger are directed at a certain group of people or at the YouTube video and its creator.
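As a rough sketch of the recommendation comparison, assuming the recommended videos have already been crawled and each one labeled by the leaning of its channel (the labels below are made-up placeholders, not real data):

```python
from collections import Counter

# Hypothetical labels for a crawl of sidebar recommendations; in practice
# each recommended video would be mapped to the leaning of its channel.
recommended_labels = ["right", "neutral", "right", "left", "neutral", "right"]

counts = Counter(recommended_labels)
total = sum(counts.values())

# Share of recommendations per leaning; a large imbalance relative to the
# pool of available videos would hint at a non-neutral recommender.
for label, count in counts.most_common():
    print(f"{label}: {count / total:.1%}")
```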
