Last week there were two (unintentionally) related pieces in the New York Times (NYT) that addressed the issue of managing comments in online communities:
- In Readers With Plenty to Say, Public Editor Arthur Brisbane wrote a column describing the NYT's approach to publishing reader comments online, and the challenges that approach creates.
- In Facebook Wrestles With Free Speech and Civility, Miguel Helft described Facebook's ongoing struggles to maintain the proper balance between allowing open discourse and protecting people from harm.
Both pieces generated a fair bit of commentary (268 comments for the Brisbane column and 88 for the Facebook article), including mine.
On Tuesday I received an email message from Mr. Brisbane asking permission to include an excerpt from my comment on his piece in a follow-up column, Keeping Up With the Commentariat. Of course I agreed!
Since this is an important issue that many organizational leaders have concerns about, I thought a blog post sharing the articles, as well as my perspective, was in order.
Below I include links to each piece and reposts of my comments. I encourage everyone to read ALL of the comments. I think you'll find you learn as much from them as you do from the original articles, maybe more.
In the final section of this post I include a few more thoughts on the subject of community management and comment moderation. I encourage you to share your ideas as well.
My comment was #163. An abbreviated version of point #3 was used in the follow-up piece.
Reading the article and some of the comments, I was struck by the 1.0 approach to a 2.0 reality. Here's what I would advise if you were my client:
- Debalkanize your approaches to moderation. For most readers there is just one NYT, and that stakeholder perspective should trump the perspectives of your internal stakeholders. At a minimum, there should be consistency in the approaches you use within certain features (e.g., columns, news articles, opinions).
- Establish a set of simple but comprehensive posting guidelines and make sure they're clearly visible and available. These rules should include things like how long comments are open and other issues that have been causing confusion.
- Don't allow anonymous posting. Make readers take responsibility for their words and ideas by clearly identifying themselves. The law may eventually take care of this, but in the meantime you can establish a policy.
- Don't pre-moderate comments. Whether through technology or human effort, moderate comments AFTER they've been posted, and delete only those that clearly violate the rules. Let people take individual responsibility for everything else, and let the collective conversation determine the quality of the comments through readers' recommendations and responses, as well as their ability to report items as inappropriate.
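For technically minded readers, the post-moderation approach in that last point can be sketched in a few lines of code. This is purely an illustration under assumed rules and thresholds (the rule set, the flag threshold, and all names here are hypothetical), not a description of any system the Times or Facebook actually runs:

```python
# Hypothetical sketch of a post-moderation workflow: comments publish
# immediately, and moderators remove only clear rule violations, leaving
# everything else to community recommendations and reports.
from dataclasses import dataclass, field

CLEAR_VIOLATIONS = {"threat", "spam", "personal attack"}  # assumed rule set
FLAG_THRESHOLD = 5  # assumed: community reports before a human reviews it

@dataclass
class Comment:
    author: str
    text: str
    labels: set = field(default_factory=set)  # tags a filter/moderator applied
    flags: int = 0            # "report as inappropriate" count
    recommendations: int = 0  # reader recommendations
    visible: bool = True      # published immediately -- no pre-moderation

def moderate_after_posting(comment: Comment) -> Comment:
    """Remove only clear violations; everything else stays up."""
    if comment.labels & CLEAR_VIOLATIONS:
        comment.visible = False
    return comment

def needs_review(comment: Comment) -> bool:
    """Community flags surface borderline comments for a human moderator."""
    return comment.visible and comment.flags >= FLAG_THRESHOLD
```

The design choice the sketch makes explicit: nothing blocks publication up front, and human attention is spent only where the community's own signals (flags) concentrate it.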
As other readers have noted, you seem to have some understanding of the problems, but you haven't yet devised a sound solution. Perhaps if you crowdsource the responses to this column you'll be able to develop an approach that makes sense. And remember: this is a happy problem. Imagine if no one wanted to comment on the digital pieces you publish...
My comment was #88. And apparently I got the last word!
What struck me as I read the article, as well as some of the comments, is the sense of "passive entitlement" that many people (ironically) have. Like the federal government, people love to hate FB for having too much power and control, while simultaneously expecting them to perfectly manage a world of more than half a billion members. In any civilized community, digital or otherwise, policing is the responsibility of ALL of us. The other day, for instance, a group I follow posted an image I found highly offensive, and I commented on it. Though my perspective was not shared by other commenters, I expect other folks were offended as well. The page owner stood by her editorial rights, but she also changed the image on her website and may have even deleted the item from her wall. I don't know if the image would have violated FB's terms of service, but they didn't have to get involved. We resolved the matter on our own.
I was impressed to learn how FB is approaching this delicate issue. I agree they shouldn't suppress people's rights to express themselves, even if others may find their expression offensive, as long as it doesn't violate their terms of service. Is there gray area? Absolutely. But I'd rather have them err on the side of allowing too much expression rather than too little.
My comment was #21. As of this writing (Monday evening) there were 60 comments. To avoid redundancy with the preceding comments, I am only including an excerpt below.
Thanks for including an excerpt from my comment on last week's column in this week's follow-up. I'd like to reiterate part of what I said there in response to "Let the Readers Pitch In":
I agree with the notion of allowing readers to share editorial responsibility, but I don't think their involvement needs to be formalized - even on a volunteer basis. A key element of social media is that communities can (and should) police themselves, based on the formal rules and informal norms they've established. With that in mind, I suggested... (I repeated point #4 from my original comment and referred to the Facebook article and my comments there).
A Few More Thoughts
In Part 3 of the SMinOrgs Social Media Primer, two of the mental shifts I suggest leaders need to make relate to recognizing changes in the balance of power and loss of control. Leaders who are able to make these shifts can actually benefit from opening up their communities and platforms to comments from members and other stakeholders.
Although it may be implicit in my comments above, I think it's also important for Community Managers to remember:
- In defining posting rules, less is more. The emphasis should be on basic rules of civility and respect for others. Typical guidelines prohibit things like cursing, personal attacks, name calling, threats of violence, off-topic comments, and spam.
- In addition to being fairly simple and clearly available, posting rules should be applied consistently. The simpler the rules are, the easier that is to do.
- The consequences of violating the rules should be clearly defined and applied consistently and without hesitation. Don't pull your punches.
- Sensitive subjects are unavoidable. Rather than not raising or allowing them for fear of the conversation, it's better to provide people with a civilized forum for addressing them. If they feel strongly enough about an issue, they'll find a way to express themselves - and if the issue relates to your organization/brand, you're better off allowing those conversations to take place on your digital turf.
- Though it's important to regularly monitor and moderate content, be careful about deleting what might be considered inappropriate comments. As long as something doesn't violate the posting rules, it should stay.
- Make sure staff are properly trained, particularly with respect to handling conflict and crises. Among other things, they should know which comments to respond to and which to ignore, as well as the most diplomatic ways to correct factual errors, express empathy, and move conversations from the public arena to a private one when necessary.
- If you're going to offer tools for members of the community to provide feedback on each other's comments, consider creating guidelines for them as well. Doing so reminds them that this is a responsibility they should take seriously. It also provides you with the ability to apply negative consequences when someone abuses this privilege. (I'm indebted to commenter #53 on Keeping Up With the Commentariat for reminding me of the importance of this point.)
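That last guideline, holding community members accountable for how they use feedback tools, can also be sketched in code. Again, this is only an illustration under assumed limits (the abuse threshold and all names here are hypothetical):

```python
# Hypothetical sketch: a member who repeatedly files baseless
# "report as inappropriate" flags eventually loses the privilege.
from collections import defaultdict

ABUSE_LIMIT = 3  # assumed: rejected reports allowed before revocation

class FlagPrivileges:
    def __init__(self):
        self.rejected = defaultdict(int)  # member -> reports judged baseless
        self.revoked = set()

    def record_review(self, member: str, report_upheld: bool) -> None:
        """Record a moderator's ruling on one of the member's reports."""
        if not report_upheld:
            self.rejected[member] += 1
            if self.rejected[member] >= ABUSE_LIMIT:
                self.revoked.add(member)

    def may_flag(self, member: str) -> bool:
        """Members keep the privilege until they've demonstrably abused it."""
        return member not in self.revoked
```

The point the sketch makes is the same one the guideline makes: giving members policing power only works if the community manager tracks how that power is used and can withdraw it.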
Finally, from an organizational perspective, the legal and policy issues can't be emphasized enough. I'm planning to write a white paper addressing these issues in greater detail, but for now the three key points for organizational leaders to remember are:
- You must create a social media policy and/or revise an existing telecommunications policy to reflect new digital technologies and how they might be used.
- Remember that - however it's defined - a social media policy is part of a larger set of employee policies that can be impacted by digital technology. You need to evaluate your entire handbook to determine whether and how other policies need to be updated as well. In conducting this review, it's critical to balance legal and business perspectives and devise an approach that also balances employer and employee interests.
- Once the necessary policies are created/updated, training is critical - not just for employees, but also for managers who have unique responsibilities and concerns they'll have to address. This training should provide information on the tools and technologies and how they can be used, as well as guidance on individual and organizational rights and responsibilities. And given how much both technology and the legal aspects change (primarily through case law), this training should be updated regularly and provided on an ongoing basis.
As always, I welcome your comments, particularly about anything I might have missed.