Facebook groups have played a vital role in helping people connect with peers and friends who share a common interest. However, there have been many instances of groups breaking Facebook’s terms and policies.
Facebook’s proactive detection tools can catch violations even when they are not reported by users. Artificial intelligence deserves much of the credit here, helping Facebook find unreported violating content and remove it.
Over the last year, Facebook removed more than 1.5 million pieces of content in groups for violating its policy on organised hate, and 91% of those pieces were found proactively by Facebook.
Under its hate speech policy, Facebook removed 12 million pieces of content in groups; similarly, 87% of this content was located by AI. The same treatment applies to posts within a group, and Facebook can take down an entire group if its rules of conduct are broken. Over the last year, 1 million groups were removed for not complying with Facebook’s policies.
This time, Facebook has decided to take a further step to safeguard its policies. People with a history of violating Facebook’s policies are restricted from creating new groups. Admins and moderators of a group that was removed by the platform will be prohibited from creating a new group of a similar kind for a specific period of time.
If a member of a group is found violating the community standards within that group, their posts in the group will require approval for the next thirty days. If admins repeatedly approve posts that violate the code of conduct, Facebook will remove the group.
Along with this, Facebook is working to ensure that all groups have an active admin. If an admin leaves, the admin role can be transferred to an interested member.
Facebook will start archiving groups in the coming weeks: groups that have been without an admin for a long time will be archived. Likewise, if the remaining admin chooses to leave and no member is willing to take over the role, Facebook will archive the group.
Facebook has decided to remove health groups from recommendations because it is important that people get health-related advice from authentic sources. People can still invite friends to a health group, but such groups will no longer be suggested. For more information, see Facebook’s recommendation guidelines.
Groups tied to violent organizations and movements are also restricted. Recently, Facebook banned a US-based anti-government network connected to the boogaloo movement and removed 106 groups linked to such anti-government networks. Organizations like QAnon and anarchist groups that actively support violence will also get special attention: Facebook will no longer recommend such groups.
To combat misinformation in groups, a “remove, reduce, inform” approach has been introduced. It involves a global network of independent fact-checkers whose primary functions are to keep an eye on groups’ activity and content as well as on their moderators and admins, to reduce the distribution of groups involved in spreading misinformation, and to inform people of any ongoing circulation of misinformation.
Although Facebook has come a long way, there is still scope for improvement, which the company has promised to address in the coming days. The social media platform has pledged to update its policies so that groups remain a medium for communication without hurting the sentiments of any community or individual.