Fake news is a real and serious danger lurking online. Huge volumes of false information and harmful content are published on social media and similar platforms, then shared far and wide, spreading uncontrollably across the web.
It doesn’t help that people who get paid to write fake news articles make them look legitimate, so it’s hard to identify what’s real and what’s not. Online platforms don’t seem to prioritize managing spammy content on their end, either, but you can always step up to protect your brand’s online reputation through content moderation.
What Is Content Moderation?
Content moderation is the practice of applying a set of guidelines to text, images, video, and other user-generated content (UGC) to give you more control over the type of information you offer your audience. It allows you to filter out whatever is not useful, acceptable, or necessary for your brand and the community you serve.
Why You Should Care About Online Content Moderation
Back in 2016, prior to the U.S. presidential election, PepsiCo faced calls for a boycott as a result of a fake news story claiming that the beverage company’s CEO, Indra Nooyi, had told Donald Trump’s supporters to take their business elsewhere. This only goes to show that content moderation is imperative for businesses, as misinformation can taint a brand’s image and influence public perception. You risk brand boycotts and even platform abandonment if you forgo content moderation.
In fact, 27% of respondents to Business Insider’s 2019 Digital Trust Survey said they would stop using a social platform if it continued to allow harmful content. Although Facebook employs approximately 15,000 workers to remove inappropriate content, that’s clearly not enough. Businesses need to step up in protecting their brand’s online community.
Companies that take content moderation seriously can enjoy far-reaching benefits, the most obvious of which is that it allows you to keep your site free from anything that doesn’t offer value to your business or your customers.
Moderating content also reduces the risk of visitors seeing content on your channels that may upset or offend them. Customers who feel safe on your platform are less likely to leave your site and more likely to interact with your brand and content. Ultimately, this can boost your site traffic, improve your search engine rankings, and establish your credibility as a source of useful and positive information.
Content moderation also allows you to promote a culture of peace in the digital space. It prevents cyberbullies and trolls from taking advantage of your brand online and instead allows you to build a community of like-minded individuals who believe in using the internet responsibly.
Types of Content Moderation
There are a few ways to moderate content on your website, but the three key methodologies are:
- Pre-moderation. This involves placing content that has been created by users in a queue so that it can be reviewed prior to publishing. Pre-moderation allows you to screen user submissions for hateful comments or offensive images so they won’t ever be published on your site. Only content that passes community standards is shared with the public.
- Post-moderation. This method allows all content to be displayed on your site immediately after users submit it, but each submission is still reviewed and either accepted or removed by your community moderator within a short period of time.
- Reactive Moderation. Under this content moderation type, users themselves are responsible for flagging undesirable content. Community members can down-vote, comment on, or report objectionable posts they come across on the platform, helping moderators swoop down on whatever content goes against the rules. (A simple sketch of how these three approaches might look in code follows this list.)
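If your team is building its own community platform, the differences between these three flows can be made concrete with a small, hypothetical Python sketch. The names here (Post, is_acceptable, the banned-word check, the flag threshold) are illustrative assumptions, not part of any real moderation tool or API.

```python
# Minimal, illustrative sketch of pre-, post-, and reactive moderation flows.
# All names and the naive banned-word check are assumptions for demonstration only.
from dataclasses import dataclass

BANNED_WORDS = {"spam", "scam"}  # stand-in for your real community guidelines


@dataclass
class Post:
    author: str
    text: str
    published: bool = False
    flags: int = 0  # number of user reports (used by reactive moderation)


def is_acceptable(post: Post) -> bool:
    """Very naive guideline check: reject posts containing banned words."""
    return not any(word in post.text.lower() for word in BANNED_WORDS)


def pre_moderate(post: Post) -> None:
    """Pre-moderation: review first; publish only if the post passes."""
    post.published = is_acceptable(post)


def post_moderate(post: Post) -> None:
    """Post-moderation: publish immediately, then review and unpublish if needed."""
    post.published = True
    if not is_acceptable(post):
        post.published = False


def reactive_moderate(post: Post, flag_threshold: int = 3) -> None:
    """Reactive moderation: publish immediately; review only once users flag it."""
    post.published = True
    if post.flags >= flag_threshold:
        post.published = is_acceptable(post)  # moderator reviews the flagged post


if __name__ == "__main__":
    p = Post(author="user42", text="Buy now, this is not a scam!")
    pre_moderate(p)
    print(p.published)  # False: the post is caught before it ever goes live
```

In practice, the trade-off is the same one described above: pre-moderation catches problems before anyone sees them but slows publishing, while post- and reactive moderation keep conversations flowing at the cost of brief exposure to bad content.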
Best Practices in Content Moderation
1. Set clear community guidelines
To help users and members generate quality content, you need to have clear rules in place. Be specific in defining what type of content, language, or behavior is acceptable and not acceptable for your brand. Discuss the consequences of breaking the rules, too—do not just focus on what sanctions will be applied to the erring member but also how it may affect the entire community.
Make sure that the guidelines are posted where members can easily see them, so they’ll know exactly what to do when they want to start a conversation, post a comment, and so on.
2. Stay visible
Engaging with community members can take many forms, from liking or up-voting posts to leaving a comment. It doesn’t really matter which form of interaction you choose, as long as you let users know that their contributions don’t go unnoticed.
3. Incentivize quality contributions
Some members will be more active in contributing ideas or expressing their sentiments than others. Giving away badges, top fan labels, and similar recognition can go a long way. This kind of reward system is an excellent way of enlivening the community, showing appreciation to engaged users, and encouraging the rest to participate more in group discussions moving forward.
4. Designate a community manager
Community managers or moderators are responsible for keeping the community together. Their goal is to ensure that the community thrives with valuable contributions, appropriate user behavior, and high levels of engagement among group members.
Looking to hire? Make sure you know the skills that make a good community manager.
Online Community Building Through Content Moderation
In a world of fake news and other digital nonsense, content moderation helps protect the integrity and credibility of your business. It also allows you to deliver a positive user experience and build a safe, peaceful online community around your content.
Make content moderation work for your business! Contact our team at Spiralytics to learn more.