Facebook’s New Anti-Fake News Strategy Is Not Going To Work – But Something Else Might
Paul Ralph, 1 May 17
Have you seen some “tips to spot fake news” on your Facebook newsfeed recently?

Over the past year, the social media company has been scrutinized for influencing the US presidential election by spreading fake news (propaganda). Obviously, the ability to spread completely made-up stories about politicians trafficking child sex slaves and imaginary terrorist attacks with impunity is bad for democracy and society.

Something had to be done.

Enter Facebook’s new, depressingly incompetent strategy for tackling fake news. The strategy has three frustratingly ill-considered parts.

New products

The first part of the plan is to build new products to curb the spread of fake news stories. Facebook says it’s trying “to make it easier to report a false news story” and find signs of fake news such as “if reading an article makes people significantly less likely to share it.”

It will then send the story to independent fact checkers. If fake, the story “will get flagged as disputed and there will be a link to a corresponding article explaining why.”

This sounds pretty good, but it won’t work.

If non-experts could tell the difference between real news and fake news (which is doubtful), there would be no fake news problem to begin with.

What’s more, Facebook says: “We cannot become arbiters of truth ourselves — it’s not feasible given our scale, and it’s not our role.” Nonsense.

Facebook is like a megaphone. Normally, if someone says something horrible into the megaphone, it’s not the megaphone company’s fault. But Facebook is a very special kind of megaphone that listens first and then changes the volume.

Facebook is like a megaphone. Enrique Castro-Mendivil/Reuters

The company’s algorithms largely determine both the content and order of your newsfeed. So if Facebook’s algorithms spread some neo-Nazi hate speech far and wide, yes, it is the company’s fault.

Worse yet, even if Facebook accurately labels fake news as contested, it will still affect public discourse through “availability cascades.”

Each time you see the same message repeated from (apparently) different sources, the message seems more believable and reasonable. Bold lies are extremely powerful because repeatedly fact-checking them can actually make people remember them as true.

These effects are exceptionally robust; they cannot be fixed with weak interventions such as public service announcements, which brings us to the second part of Facebook’s strategy: helping people make more informed decisions when they encounter false news.

Helping you help yourself

Facebook is releasing public service announcements and funding the “news integrity initiative” to help “people make informed judgments about the news they read and share online”.

This, too, doesn’t work.

A vast body of research in cognitive psychology concerns correcting systematic errors in reasoning such as failing to perceive propaganda and bias. We have known since the 1980s that simply warning people about their biased perceptions doesn’t work.

Similarly, funding a “news integrity” project sounds great until you realise the company is really talking about critical thinking skills.

Improving critical thinking skills is a key aim of primary, secondary and tertiary education. If four years of university barely improves these skills in students, what will this initiative do? Make some YouTube videos? A fake news FAQ?


Facebook has a depressingly incompetent strategy for tackling fake news. Shailesh Andrade/Reuters

Funding a few research projects and “meetings with industry experts” doesn’t stand a chance of changing anything.

Disrupting economic incentives

The third prong of this non-strategy is cracking down on spammers and fake accounts, and making it harder for them to buy advertisements. While this is a good idea, it’s based on the false premise that most fake news comes from shady con artists rather than major news outlets.

You see, “fake news” is Orwellian newspeak — carefully crafted to mean a totally fabricated story from a fringe outlet masquerading as news for financial or political gain. But these stories are the most suspicious and therefore the least worrisome. Bias and lies from public figures, official reports and mainstream news are far more insidious.

And what about astrology, homeopathy, psychics, anti-vaccination messages, climate change denial, intelligent design, miracles, and all the rest of the irrational nonsense bandied about online? What about the vast array of deceptive marketing and stealth advertising that is core to Facebook’s business model?

As of this writing, Facebook doesn’t even have an option to report misleading advertisements.
