Taming the Trolls: An Interview With Web Culture Guru Joseph Reagle


An in-depth look at trolls, anonymity, and the culture of online commenting.

The internet today is a strange place, and some of its strangest corners are the comment sections—public forums where people can, and are encouraged to, voice their opinions. While many choose to do so under their own names, others remain anonymous, and that cloak of anonymity lets them post in a rawer, less filtered way. The result? An online community that is at once democratic and inclusive, yet also hostile, counterproductive, and often downright weird.

Few know internet comment sections better than Joseph Reagle, Professor of Communication Studies at Northeastern University and faculty associate at the Berkman Center for Internet and Society at Harvard University. He's the author of Good Faith Collaboration: The Culture of Wikipedia, and the forthcoming Reading the Comments: Likers, Haters, and Manipulators at the Bottom of the Net, due out next year.

Excerpts from our talk with Prof. Reagle appeared in Reviewed.com's weekly column in USA Today last week, but his comments were insightful enough that we feel compelled to share the entire interview.


There seems to be a growing effort to boost civility on the web, with concerted efforts like CiviliNation, and more isolated strategies like the Huffington Post's decision to remove anonymous commenting. Why now?

I previously wrote a book about Wikipedia called Good Faith Collaboration. One of the premises is that there’s an old internet law called Godwin’s Law, which says that the longer a conversation goes on, the probability of someone calling someone else a Nazi or Hitler approaches one. And this goes back to Usenet, even before the web, so it’s not a wholly new phenomenon. It’s kind of inevitable that people are going to say nasty things about one another online. Good Faith Collaboration was about how something like Wikipedia could be produced despite this tendency.

Do you feel that anonymity is a fundamental right?

Prof. Joseph Reagle
I don’t think it’s a fundamental right. I think if people want to post comments they can set up their own blog. Indeed, one of the ironies here is how Dave Winer—who has some claim to being both one of the first bloggers and one of the first bloggers to have a comments section on his website—eventually turned comments off as well. And he turned them off because he basically believed, “you’re supposed to be here to comment on a particular article in a substantive way. If you want to say something stupid or if you want to rant you can go somewhere else.”

How do you feel about recent efforts to either remove anonymity or remove comments sections altogether?

On one hand I do think people have a right to say we don’t want nastiness here. Similarly, people have a right to create a page elsewhere and say whatever they’d like. The question really is, how do you build a community? Because it is very useful to have substantive comments and civil disagreements and varied opinions expressed.

"People say horrible things on Twitter all the time under their own names."

For instance, maybe people sign up with their real name. There’s some evidence that seems to work well; however, there’s also plenty of evidence that people do horrible things even under their real identities, perhaps without realizing it. People say horrible things on Twitter all the time under their own names. There was a case in which the feminist website Jezebel—after President Obama was elected—went and collected a series of tweets from teenagers saying horrible, racist things under their own names. And there’s the Steubenville rape case—a lot of those guys were tweeting horrible things under their real names. Even if they didn’t have anonymity, maybe they thought that they did.

Couldn’t it also be argued that young people tend to be more reckless?

Absolutely. As for the social research, there are lots and lots of theories, developed over decades, as to why anonymity might prompt people to act horribly—if it is in fact the anonymity and not something else.

You can go back to Plato and the story of Gyges: Imagine a man who has a ring that can make him invisible. In this parable the man uses the ring to seduce the queen, kill the king, and take over the kingdom. So there’s long been this suspicion that if we’re anonymous, we lose accountability and are more likely to act horribly.

"You can go back to Plato and the story of Gyges."

But there’s two things going on from a social sciences point of view: One, does anonymity make good or otherwise decent people behave worse? And that’s the thing most people focus upon. Or two: Are bad or evil people—however you define that—more attracted to commenting and trolling and griefing than normal people? When you look at a thread and you see all this horrible stuff, is this normal people behaving badly, or is this bad people behaving promiscuously? And there’s evidence for both of those, too.

A month ago there was a paper published, "Trolls Just Want to Have Fun", in which the authors tried to correlate personality indicators—things like sadism—with a propensity to comment and troll. They found a correlation.

Conversely, people sometimes behave well in anonymous circumstances, which gets back to Wikipedia: How is it you can have this thing where people, with their real names but most often with pseudonyms or even anonymously, contribute to Wikipedia?

So when you get down to it, it seems like a cultural thing.

Yes. For instance, MetaFilter has been around for a long time, and it has a comment culture that has been fairly successful for many years. And they require a one-time fee for people to sign up, and that seems to really work well there.

"Boing Boing had something called a disemvoweler: A moderator could remove all the vowels of a comment that they didn’t like."

Other things that people have tried and gotten rid of: Boing Boing has experimented with turning off comments at various times and coming back with different systems. They had something called a disemvoweler: A moderator could remove all the vowels of a comment that they didn’t like, and the theory was that somehow the abusive commenters would be made to feel ridiculous and leave or not want to participate. It didn’t last long, but there are many, many techniques that people have tried.

And then there are sites like Reddit and Gawker, which are certainly not immune to nasty comments but have voting systems where the most relevant or most interesting comments work their way toward the top. In effect, the trolls tend to be filtered out.

Yes, I think Slashdot really popularized that—they called it Karma—but no system is really perfect. Even with that, people found ways to abuse it, not necessarily to say horrible things but to sort of control what’s happening on the site. On Digg there were these things called Bury Brigades, where people would form cabals—which, again, is a very ancient notion going back to Usenet—of people who would upvote one another and bury other people they didn’t want to see succeed on Digg. On Slashdot they had a notion called Karma Whoring.

"Almost any system that you deploy is the subject of some type of abuse."

Reputation systems are often a favorite of academics because they think, “we just designed this perfect system.” But almost any system that you deploy is the subject of some type of abuse.

Do you think the nature of online anonymity is changing also in response to recent privacy and surveillance concerns? People seem to be more aware of their online identity than ever.

I haven’t seen any recent research about attitudes to anonymity and privacy, but my suspicion is that all this news is making people feel that their attempts to protect their privacy are useless. I think people feel like it’s just impossible.


Hero Image: Flickr user "dirt3_monster" (CC BY-NC-ND 2.0)