Facebook and Google have faced criticism over the spread of fake news on their sites, which many have pointed to as a key factor influencing the presidential election. On Monday, both companies announced plans to curb fake news by removing misleading stories from their advertising platforms, The Guardian reports.
Google announced a policy update on Monday aimed at removing its advertisements from hoax sites. "We will restrict ad serving on pages that misrepresent, misstate, or conceal information about the publisher, the publisher's content, or the primary purpose of the web property," a representative for Google told Reuters.
Facebook updated the language in its own advertising platform, the Facebook Audience Network. The policy already bans "misleading or illegal" content, but the update makes clear that its target is fake news. "Our team will continue to closely vet all prospective publishers and monitor existing ones to ensure compliance," a representative for Facebook said in a statement.
Facebook has been at the center of the fake news debate, as many have pointed out that its algorithm, which selects posts based on users' engagement, creates ideal conditions for the spread of fake news. If Facebook users interact more with fake news, the algorithm will surface those links more widely. Last week, Mark Zuckerberg denied claims that the spread of fake news on the site could have influenced the election. "I think there is a certain profound lack of empathy in asserting that the only reason why someone could have voted the way they did is because they saw some fake news," he said.
While Facebook's algorithm seems to provide a space where fake news can go viral — an incentive, separate from ad revenue, for hoax sites to keep publishing — Google faces a different challenge in tackling fake stories. On Monday, the site's top news result for "final election results" was a fake story by 70News claiming that Donald Trump had won the popular vote by 700,000 votes, the same margin by which Hillary Clinton won the popular vote in the election.
Despite Facebook's public denials of culpability in the spread of fake news leading up to the election, company officials have spent the last week meeting to decide how to tackle the problem, BuzzFeed News reports. It remains to be seen how the updates to advertising policy will affect the spread of fake news on the sites going forward.