Facebook increases efforts to clean up groups

Tyrone Stewart

Facebook has said it will now take a more ‘transparent’ approach to how it deals with misinformation and extremist content spread within groups – whether those groups are hidden or not. The move comes alongside an updated privacy model for groups, which aims to make group settings more straightforward.

Previously, groups on Facebook could choose between being ‘public’, ‘closed’, or ‘secret’. To make the differences clearer, Facebook is now using just two terms – ‘public’ and ‘private’ – to describe privacy settings.

Public groups will continue to be visible to anybody, while there are two different types of private group. The first type of private group can be found by anyone but only members can see who’s in the group and what they post – this replaces closed groups. The second can only be found by members – this replaces secret groups.

Within all of these groups, Facebook is using AI and machine learning to proactively detect bad content. Any content flagged by Facebook’s algorithms, or reported by users, is examined by the social network’s team of reviewers to see whether it actually violates the platform’s community standards.

Facebook also provides group admins with an overview of content that has been removed or flagged, under the ‘Group Quality’ section, which includes a dedicated area covering fake news. Admins can now also add group rules, and they have the option to share which rule a member broke when declining a pending post, removing a comment, or muting a member.

The added transparency now extends to group members as well. Before joining a group, Facebook users can see relevant details about the group, including who the admins and moderators are, and whether the group has had any other names in the past. Users can also preview groups they are invited to.