Facebook Bungled Early Management of Anti-Vaccine Content, Ignored Staff, Documents Show

    As false statements about COVID-19 vaccines circulated online in March, Facebook research employees believed they could help reduce misinformation shared on the platform.

    Researchers planned to alter how posts were ranked on users’ news feeds and to promote posts from authoritative sources such as the World Health Organization. They even suggested turning off comments on misleading vaccine posts. But when the ideas were shared with executives, they initially went nowhere, according to documents obtained by the Associated Press.

    Although company employees reacted to the study with excitement, Facebook delayed taking action.

    “Is there any reason we wouldn’t do this?” one Facebook employee wrote in response to an internal memo outlining how the platform could rein in anti-vaccine content.

    Facebook ignored the employees’ recommendations and was slow to implement the study’s findings, according to internal documents provided by Facebook whistleblower Frances Haugen.

    Imran Ahmed, CEO of the Center for Countering Digital Hate, an internet watchdog group, suggested the delay stemmed from the company’s fear of losing profit.

    “Why would you not remove comments? Because engagement is the only thing that matters,” said Ahmed. “It drives attention and attention equals eyeballs and eyeballs equal ad revenue.”

    To test their theory early on, Facebook employees changed how posts were ranked for more than 6,000 users in Brazil, Mexico, the Philippines and the U.S., so those users saw vaccine posts based on trustworthiness rather than popularity.

    The study found a 7 percent decrease in negative interactions on the site, a 12 percent decrease in debunked posts and an 8 percent increase in content from legitimate public health organizations.

    For more reporting from the Associated Press, see below.

    FILE – In this Sept. 23, 2021, file photo, Oumie Nyassi, in a doctor’s office at Serrekunda hospital in Gambia, shows a video circulating on the internet, confirmed to be fake news, of a woman claiming she was magnetized after receiving the COVID-19 vaccine. Last spring, as false claims about vaccine safety threatened to undermine the world’s response to COVID-19, researchers at Facebook wrote that they could reduce vaccine misinformation by tweaking how vaccine posts show up on users’ newsfeeds, or by turning off comments entirely. Yet despite internal documents showing these changes worked, Facebook was slow to take action. (AP Photo/Leo Correa, File)

    In a statement, company spokeswoman Dani Lever said the internal documents “don’t represent the considerable progress we have made since that time in promoting reliable information about COVID-19 and expanding our policies to remove more harmful COVID and vaccine misinformation.”

    The company also said it took time to consider and implement the changes.

    Yet the need to act urgently couldn’t have been clearer: At that time, states across the U.S. were rolling out vaccines to their most vulnerable — the elderly and sick. And public health officials were worried. Only 10 percent of the population had received their first dose of a COVID-19 vaccine.

    A third of Americans were thinking about skipping the shot entirely, according to a poll from the Associated Press-NORC Center for Public Affairs Research.

    Even so, Facebook employees acknowledged they had “no idea” just how bad anti-vaccine sentiment was in the comment sections of Facebook posts. Company research in February found that as much as 60 percent of the comments on vaccine posts were anti-vaccine or vaccine reluctant.

    Published at Tue, 26 Oct 2021 19:25:16 +0000
