How to handle flagged content in a community? [closed]

Closed. This question is opinion-based. It is not currently accepting answers.

Want to improve this question? Update the question so it can be answered with facts and citations by editing this post.

Closed 8 years ago.


On a multilingual community consisting almost entirely of user-generated content, is there a commonly used way to handle flagged content (profanity, racism, generally illegal material, etc.)?

As there will be a lot of non-English content, the only way to handle the flagging itself is to crowdsource it to the community and automatically hide/delete the flagged content once it passes a threshold. But what method could be used to stop abuse? e.g. "I don't like him, let's all report this and get it deleted."


First of all, it depends on your content.

But in general, I would start by hiding or deleting flagged content once it reaches a threshold.

As the community grows, I would add crowdsourced moderation and strike a balance between the two.
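Here is a minimal sketch of what that could look like, assuming a simple in-memory model; the names (Post, Flag, weight_for, HIDE_THRESHOLD) and the reputation cut-offs are invented for illustration. Weighting each flag by the reporter's track record is one way to blunt the "I don't like him, let's all report this" problem.

    from dataclasses import dataclass, field

    HIDE_THRESHOLD = 3.0  # assumed tuning value

    @dataclass
    class Flag:
        reporter_id: str
        reporter_reputation: int  # e.g. number of previously upheld reports

    def weight_for(flag: Flag) -> float:
        # Trusted reporters count more; throwaway accounts count less.
        if flag.reporter_reputation >= 50:
            return 1.0
        if flag.reporter_reputation >= 10:
            return 0.5
        return 0.2

    @dataclass
    class Post:
        post_id: str
        flags: list[Flag] = field(default_factory=list)
        hidden: bool = False

    def register_flag(post: Post, flag: Flag) -> None:
        # Ignore duplicate flags from the same reporter.
        if any(f.reporter_id == flag.reporter_id for f in post.flags):
            return
        post.flags.append(flag)
        if sum(weight_for(f) for f in post.flags) >= HIDE_THRESHOLD:
            post.hidden = True  # hide rather than delete, so it can be reviewed

Hiding rather than deleting at the threshold means an abusive mass report can later be reversed by a moderator.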

I would also run a general scan over all posts, searching for keywords that might indicate or accompany bad content (a sketch follows below).

Also, you will need to build in some tolerance, as some posts might reference illegal things for good reasons.

e.g. "don't take drugs"
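A minimal sketch of such a keyword pre-scan, assuming hand-maintained per-language word lists; the KEYWORDS table and the needs_review name are purely illustrative. A match only queues the post for human review rather than deleting it, which is the tolerance needed for posts like "don't take drugs".

    import re

    # Hypothetical per-language keyword lists maintained by moderators.
    KEYWORDS = {
        "en": {"drugs", "counterfeit"},
        # add further languages as the community provides word lists
    }

    def needs_review(text: str, lang: str) -> bool:
        # Tokenise, lowercase, and check for any listed keyword.
        words = set(re.findall(r"\w+", text.lower()))
        return bool(words & KEYWORDS.get(lang, set()))

    # needs_review("don't take drugs", "en") -> True, but the post is only
    # put into a review queue, never removed automatically.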

If the community develops well, I would rely mostly on it.


Another option you might consider is to allow your users to "hide" other users, i.e. not see the content of hidden users.

This allows people to "remove" other users who they feel don't contribute to the community.
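A minimal sketch of that idea, assuming each viewer keeps a set of hidden author ids; FeedPost and visible_feed are hypothetical names.

    from dataclasses import dataclass

    @dataclass
    class FeedPost:
        post_id: str
        author_id: str
        body: str

    def visible_feed(posts: list[FeedPost], hidden_authors: set[str]) -> list[FeedPost]:
        # Posts by authors this viewer has hidden are filtered out of their
        # feed only; nothing is deleted and other users still see them.
        return [p for p in posts if p.author_id not in hidden_authors]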

You could also allow users to report bad posts and let a human decide whether to hide or delete them. You would need clear community rules for this to be effective.
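A minimal sketch of such a report queue with a human in the loop; Report, next_for_review, and resolve are illustrative names, and the storage layer is left as a callback.

    from dataclasses import dataclass
    from typing import Callable, Optional

    @dataclass
    class Report:
        post_id: str
        reporter_id: str
        reason: str
        resolved: bool = False

    def next_for_review(reports: list[Report]) -> Optional[Report]:
        # Oldest unresolved report first; a human decides the outcome.
        pending = [r for r in reports if not r.resolved]
        return pending[0] if pending else None

    def resolve(report: Report, decision: str, apply_action: Callable[[str, str], None]) -> None:
        # decision is "keep", "hide" or "delete"; apply_action performs it
        # against whatever storage you use.
        if decision in ("hide", "delete"):
            apply_action(report.post_id, decision)
        report.resolved = True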

