Attacks on Media

Trolling is stoppable

17 Mar, 2017

Understanding trolling behaviour holds the clue to stopping it.

Trolling and online abuse are today an ugly reality of the online space. The phenomenal opening up of opportunities to reach out and disseminate news and opinions online is accompanied by an unpleasant aspect: trolls. Women in general and journalists in particular are specifically targeted by trolls, who sometimes succeed in enforcing self-censorship, or even in pushing women offline, through their abusive online behaviour. Trolling (posting abusive, inflammatory or provocative comments) has generally been regarded as the behaviour of a few anti-social, right-wing, misogynist, casteist or racist individuals who take advantage of the ease of technology and the anonymity that the online space offers.

However, recent research by Justin Cheng et al. suggests that it is not just sociopaths: ordinary people, too, can engage in such online behaviour. The research was conducted by analyzing 16 million comments and running an online controlled experiment. The researchers propose two primary triggers for trolling: the individual's mood, which is influenced by the time of day (more trolling was found to happen at night), and the surrounding context of a discussion, for example exposure to prior trolling comments in that discussion. In other words, someone who trolls encourages others to do so. The research found that trolling is "situational" and that ordinary people can be influenced to troll. Left unchecked, trolling can propagate and quickly become a community norm.

While it is sobering to read evidence that you or I could also troll, given the right (or wrong!) circumstances, the research provides clues to stopping trolling in its tracks and creating safe online spaces. "By understanding what leads to trolling, we can now better predict when trolling is likely to happen. This can let us identify potentially contentious discussions ahead of time and pre-emptively alert moderators, who can then intervene in these aggressive situations," say the researchers. They suggest that machine learning algorithms can sort through millions of posts far faster than any human moderator. Computers set up to spot trolling behaviour can identify and filter undesirable content with much greater speed, removing the bank of trolling comments that others would otherwise feed off or be inspired by.
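To make the idea concrete, here is a toy sketch of automated comment flagging. The researchers envisage trained machine-learning classifiers; this minimal illustration instead uses a hypothetical hand-written word list and a simple matching function, just to show the shape of a "flag for a human moderator" pipeline:

```python
# Toy moderation filter: flag comments that contain abusive terms so a
# human moderator can review them before they set the tone of a thread.
# The word list and function names are illustrative assumptions, not
# the researchers' actual model, which would be a trained classifier.

ABUSIVE_TERMS = {"idiot", "stupid", "shut up"}  # hypothetical word list

def flag_for_review(comment: str) -> bool:
    """Return True if the comment should be queued for a moderator."""
    text = comment.lower()
    return any(term in text for term in ABUSIVE_TERMS)

comments = [
    "Great article, thanks for sharing.",
    "You are an idiot if you believe this.",
]
flagged = [c for c in comments if flag_for_review(c)]
```

A real deployment would replace the word list with a model scoring each comment, but the surrounding plumbing, scanning every post and routing suspect ones to moderators, looks much the same.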

Additionally, social interventions can reduce trolling, say the researchers. "If we allow people to retract recently posted comments, then we may be able to minimize regret from posting in the heat of the moment. Altering the context of a discussion, by prioritizing constructive comments, can increase the perception of civility," they suggest. Interestingly, even a small step like pinning a post about a community's rules to the top of discussion pages helps, as a recent experiment conducted on Reddit showed.
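The "prioritizing constructive comments" intervention can be sketched as a simple re-ranking of a thread. The constructiveness score below is a stand-in assumption (for example, upvotes minus abuse reports); the researchers do not prescribe a particular scoring method:

```python
# Sketch of altering a discussion's context: show the most constructive
# comments first so later readers see civility, not prior trolling.
# The "score" field is a hypothetical constructiveness measure.

def rank_comments(comments):
    """Return comments ordered from most to least constructive."""
    return sorted(comments, key=lambda c: c["score"], reverse=True)

thread = [
    {"text": "This is garbage.", "score": -3},
    {"text": "Here is a source that adds useful context.", "score": 7},
]
ordered = rank_comments(thread)
```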

Understanding that every user, you and I included, is responsible for both the inspiring and the vicious conversations in the digital space is key to having more productive online discussions.

See a brief article here and the full research here.
