Facebook announced several updates today aimed at reducing the reach of harmful content across its platform. The company has said its users are posting more in the social network’s private spaces, including groups and messaging, a shift that could make it tougher for the tech giant to ensure the authenticity of news on its platform.
Facebook said today that groups that repeatedly share misinformation will now be distributed to fewer people in the News Feed, a change that could help keep problematic content from going viral. The company also said that in the coming weeks it will start looking at how administrators and moderators of Facebook groups decide what content to keep up, which will help it determine whether a group is violating the social network’s rules.
Facebook is also making some smaller changes around fact-checking. The Associated Press will now begin fact-checking some videos in the US, and Facebook will start including “Trust Indicators” when users click to see context around a publication. Those indicators come from The Trust Project, a consortium of news organizations that makes those determinations.
“We can now proactively detect many types of violating content posted in groups before anyone reports them” and sometimes before anyone sees them, Guy Rosen, Facebook’s vice president of integrity, and Tessa Lyons, Facebook’s head of News Feed Integrity, wrote in a blog post.
The social network unveiled a variety of other steps it’s taking to combat fake news, following criticism that its efforts aren’t working well enough.
For now, we can only hope that these features will have a positive impact on the Facebook platform and help stem the spread of fake and problematic news.