Facebook is sick and tired of your trash content, and it’s finally going to do something about it.
In a blog post on January 23, Mark Zuckerberg’s advertising giant announced two upcoming changes to the service we all know and love. First, starting January 24, Page administrators will have access to a new tab showing which of their content has been removed or demoted by Facebook. Second, and far more interesting, in the coming weeks Facebook intends to lower the threshold for preemptively booting garbage from its platform.
In a nod to transparency, the company explained that it won’t necessarily wait for Pages and Groups to violate its Community Standards before deleting them from Facebook. Instead, the company is moving toward a much more proactive approach, shall we say, to deciding what can and cannot stay on its servers.
“When we remove a Page or Group for violating our policies, we may now also remove other Pages and Groups, even if that specific Page or Group hasn’t reached the threshold to be unpublished on its own,” the blog post explains. “To enforce this updated policy, we will look at a broad set of information, including whether the Page has the same people administering it [as a previously removed Page] or has a name similar to one we’re removing.”
In other words, if Facebook decides that Trash Page X violates its Community Standards and removes it, the company can go ahead and remove other Pages with similar names, even if those Pages don’t violate any specific policy themselves.
We contacted Facebook to confirm whether this means Pages run by completely separate people could be taken down because someone else’s Page violated the company’s Community Standards, but we had not received a response to that specific question at the time of publication.
As the new policy reads, it seems Facebook is tired of having to point to specific Community Standards violations every time it removes a Group or Page from its platform. This new approach gives the company a bit more discretion when it comes to banishing content.