Facebook announced on Thursday measures to combat the spread of fake news on its huge social network, focusing on the most toxic posts and enlisting outside expert organizations to distinguish between legitimate news and clear falsehoods.
First, Facebook will make it easier for users to report content they believe is false; that can now be done in two steps rather than three. If enough users report a particular article, Facebook will refer it to a coalition of expert organizations affiliated with the Poynter Institute.
For now, those organizations are The Associated Press, ABC News, FactCheck.org, PolitiFact and Snopes. Facebook said the group could be expanded.
Fake news can cover a wide range of topics, such as nonexistent cancer cures or accounts of sightings of the Abominable Snowman. But stories tied to politics have caused an uproar in recent times amid suspicions that they shaped public perception and may even have been a factor in the American election.
In addition, some fake news has had unfortunate consequences in the real world: a man, taken in by a false story that children were being abused at a pizzeria in Washington, went to the restaurant and fired his rifle there.
“Certainly we believe we have an obligation to combat the dissemination of false news,” John Hegeman, a vice president of product at Facebook, said in an interview. But he added that the company also takes seriously its responsibility to give everyone the opportunity to express themselves, and that it is not Facebook’s role to determine what is true and what is false.
Articles confirmed to be false will not be deleted from Facebook, but they will carry a notice that the story “is disputed” and will appear lower in other users’ news feeds.
Readers can click on that notice to read the details of why the claim was judged to be untrue. And anyone who still wants to share the content may do so, but it will be accompanied by a warning.
By partnering with respected journalistic organizations, and by flagging rather than removing false content, Facebook is sidestepping complaints others had raised about what right it has to make such decisions. Many objected, for example, that Facebook would become a censor, and perhaps a rather clumsy one, since most of its employees are engineers with little experience in deciding questions of ethics.
“They definitely do not have the necessary experience,” said Robyn Caplan, a researcher at Data & Society, a nonprofit research institute funded in part by Microsoft and the National Science Foundation. In an interview before Facebook’s announcement, Caplan urged the company to “ask for help from journalism professionals and related organizations that deal with these issues.”