9/12 Reflection #4

Summary

The article “Antisocial Behavior in Online Discussion Communities” describes a study that sought to characterize the behavior and evolution of antisocial web community users over time. The study focused on users from three websites: CNN, Breitbart, and IGN. Users under observation were split into three categories:

  • FBU (high): Future-Banned Users who were eventually banned and had many of their posts deleted.
  • FBU (low): Future-Banned Users who were eventually banned but had fewer posts deleted.
  • NBU: Never-Banned Users.

To support their characterization, the authors measured how frequently these users posted, the readability of their posts, and the degree of backlash they drew from the community. A key point of focus was that FBUs produced lower-quality content than NBUs from the very start of their membership. FBU content quality and readability also showed a noticeable downward trend over time, and these users tended to post more frequently while concentrating in fewer discussion boards. This kind of activity received greater backlash from the community and eventually led to their banning. From these data, the authors derived a method for potentially identifying antisocial users early on: users who drew a large amount of backlash, had many posts deleted, and had posts deleted more quickly early in their membership were extremely likely candidates for a later ban.

Reflection

Overall, the study is one that anyone from my generation can relate to. Having observed similar trolling and flaming in my day-to-day online life, I found it interesting to see how these users can be characterized. The article is also directly useful to my project partner and me, since our current project idea involves automated moderation that identifies and deletes posts that veer from the discussion topic. While the article focuses mainly on negatively perceived posts and comments, it was intriguing to see that some trolling behavior was positively perceived by the community. I can imagine a right-wing user voicing an unpopular opinion on a left-leaning political site and then being chastised by a leftist user, and it is not hard to picture the community egging on such behavior. It makes me wonder how moderators should respond in this situation. If the behavior goes unpunished, a toxic community forms in which attacks on unpopular opinions are tolerated. If the behavior is punished, the website may lose the faith of its users for “supporting” the unpopular opinion over the view of the toxic user, whom the community considers one of its own.

Questions

  • Breitbart is considered to be a far-right news source. What observations can be made about leftist users who started out by stating an unpopular opinion there and gradually became more aggressive in their behavior?
  • What observations can be made on those who witness antisocial behavior? Do they become more likely to engage in such behavior themselves? Do their responses to such behavior become more aggressive over time?
  • It was mentioned that users who were unjustly banned returned exhibiting far more antisocial behavior. The study claims that certain activity patterns can be used to identify antisocial users early on. Could taking such pre-emptive action lead to unjust bans and thus the creation of new trolls and flamers?
