Facebook Delayed Action as Anti-Vaccine Comments Swarmed Users, Watchdog Group Says
(WASHINGTON) — In March, as claims about the dangers and ineffectiveness of coronavirus vaccines spun across social media and undermined attempts to stop the spread of the virus, some Facebook employees thought they had found a way to help.
By altering how posts about vaccines are ranked in people’s newsfeeds, researchers at the company realized they could curtail the misleading information individuals saw about COVID-19 vaccines and offer users posts from legitimate sources like the World Health Organization.
“Given these results, I’m assuming we’re hoping to launch ASAP,” one Facebook employee wrote, responding to the internal memo about the study.
Facebook shelved some of the study's suggestions. Other changes weren't made until April.
Another Facebook researcher had proposed disabling comments on vaccine posts in March so the platform could better tackle anti-vaccine messages. That suggestion was rejected.
Critics say the reason Facebook was slow to take action on the ideas is simple: The tech giant worried it might impact the company’s profits.
“Why would you not remove comments? Because engagement is the only thing that matters,” said Imran Ahmed, the CEO of the Center for Countering Digital Hate, an internet watchdog group. “It drives attention and attention equals eyeballs and eyeballs equal ad revenue.”
In an emailed statement, Facebook said it has made “considerable progress” this year with downgrading vaccine misinformation in users’ feeds.
Facebook’s internal discussions were revealed in disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by the legal counsel of former Facebook employee-turned-whistleblower Frances Haugen. The redacted versions received by Congress were obtained by a consortium of news organizations, including The Associated Press.
The trove of documents shows that in the middle of the COVID-19 pandemic, Facebook investigated how misinformation about life-saving vaccines spread on its platform. The documents also reveal that rank-and-file employees regularly suggested countermeasures against anti-vaccine posts on the site. The Wall Street Journal reported on some of Facebook’s efforts to deal with anti-vaccine comments last month.
Facebook’s response raises questions about whether the company prioritized controversy and division over the health of its users.
“These people are selling fear and outrage,” said Roger McNamee, a Silicon Valley venture capitalist and early investor in Facebook who is now a vocal critic. “It is not a fluke. It is a business model.”
Typically, Facebook ranks posts by engagement — the total number of likes, dislikes, comments, and reshares. This ranking system may be useful for simple topics like dog photos or recipes. But Facebook’s own documents show that when it comes to divisive public health issues like vaccines, engagement-based ranking only emphasizes polarization, disagreement, and doubt.
Facebook researchers modified the ranking system for over 6,000 people in Mexico, Brazil and the Philippines to find ways of reducing vaccine misinformation. These users were not able to see posts on vaccines chosen for their popularity. Instead, they saw posts that had been selected for their reliability.
The results were striking: a 12% decrease in content making claims that fact-checkers had debunked, an 8% increase in content from authoritative public health organizations such as the WHO and the U.S. Centers for Disease Control, and a 7% decrease in negative interactions on those users’ vaccine posts.
Employees at the company reacted to the study with exuberance, according to internal exchanges included in the whistleblower’s documents.
“Is there any reason we wouldn’t do this?” one Facebook employee wrote in response to an internal memo outlining how the platform could rein in anti-vaccine content.
Facebook said it did implement many of the study’s findings—but not for another month, a delay that came at a pivotal stage of the global vaccine rollout.
In a statement, company spokeswoman Dani Lever said the internal documents “don’t represent the considerable progress we have made since that time in promoting reliable information about COVID-19 and expanding our policies to remove more harmful COVID and vaccine misinformation.”
The company also said it took time to consider and implement the changes.
Yet the need for urgent action couldn’t have been clearer: At the time, states across the U.S. were rolling out vaccines to their most vulnerable residents, the elderly and the sick, and public health officials were worried. Only about 10% of the country’s population had received a first dose of a COVID-19 vaccine, and according to a survey by The Associated Press-NORC Center for Public Affairs Research, a third of Americans thought they might skip the shot altogether.
Despite this, Facebook employees acknowledged they had “no idea” just how bad anti-vaccine sentiment was in the comments sections of Facebook posts. Company research in February found that 60% of the comments on vaccine posts were either anti-vaccine or vaccine hesitant.
“That’s a huge problem and we need to fix it,” the presentation on March 9 read.
Even worse, company employees admitted they didn’t have a handle on catching those comments, and even when they did, Facebook had no policy in place for taking the comments down. That free-for-all left anti-vaccine comments to pile up unchecked on vaccine posts from news outlets and humanitarian organizations.
“Our ability to detect (vaccine hesitancy) in comments is bad in English — and basically non-existent elsewhere,” another internal memo posted on March 2 said.
Derek Beres, a fitness instructor and author in Los Angeles, has seen anti-vaccine comments thrive whenever he promotes vaccination on his account on Instagram, which is owned by Facebook. Beres and his friends launched a podcast last year after watching conspiracy theories about COVID-19, vaccines and other health topics swirl around social media.
Earlier this year, when Beres posted a picture of himself receiving the COVID-19 shot, some on social media told him he would likely drop dead in six months’ time.
“The comments section is a dumpster fire for so many people,” Beres said.
Vaccine-related comments on Facebook had become so toxic that prominent health organizations such as UNICEF and the World Health Organization, which were encouraging people to get vaccinated, declined to use the free advertising Facebook had given them, according to the documents.
Some Facebook employees saw a stopgap: While the company worked out a plan to tackle anti-vaccine content, why not disable commenting on vaccine posts altogether?
“Very interested in your proposal to remove ALL in-line comments for vaccine posts as a stopgap solution until we can sufficiently detect vaccine hesitancy in comments to refine our removal,” one Facebook employee wrote on March 2.
The suggestion went nowhere.
Instead, Facebook CEO Mark Zuckerberg announced in March that the company would begin labeling posts about vaccines that described them as safe.
The move allowed Facebook to continue to get high engagement—and ultimately profit—off anti-vaccine comments, said Ahmed of the Center for Countering Digital Hate.
“They were trying to find ways to not reduce engagement but at the same time make it look like they were trying to make some moves toward cleaning up the problems that they caused,” he said.
It’s unrealistic to expect a multi-billion-dollar company like Facebook to voluntarily change a system that has proven so lucrative, said Dan Brahmy, CEO of Cyabra, an Israeli tech firm that analyzes social media networks and disinformation. Government regulation, Brahmy said, may be the only thing that forces Facebook to change.
“The reason they didn’t do it is because they didn’t have to,” Brahmy said. “If it hurts the bottom line, it’s undoable.”
Bipartisan legislation in the U.S. Senate would require social media platforms to give users the option of turning off algorithms tech companies use to organize individuals’ newsfeeds.
Senator John Thune (R-South Dakota), a sponsor of the bill, asked Haugen, the Facebook whistleblower, to describe the dangers of engagement-based ranking during her testimony before Congress earlier this month.
She said there are other ways of ranking content — for instance, by the quality of the source, or chronologically — that would serve users better. The reason Facebook won’t consider them, she said, is that they would reduce engagement.
“Facebook knows that when they pick out the content … we spend more time on their platform, they make more money,” Haugen said.
Haugen’s leaked documents also reveal that a relatively small number of Facebook’s anti-vaccine users are rewarded with big pageviews under the tech platform’s current ranking system.
Internal Facebook research presented on March 24 warned that most of the “problematic vaccine content” was coming from a handful of areas on the platform. In Facebook communities where vaccine distrust was highest, the report pegged 50% of anti-vaccine pageviews on just 111 accounts, or 0.016% of Facebook accounts.
“Top producers are mostly users serially posting (vaccine hesitancy) content to feed,” the research found.
That same day, the Center for Countering Digital Hate published an analysis of social media posts that found just 12 Facebook users were responsible for 73% of anti-vaccine posts on the site between February and March. In August, Facebook’s leaders told the public the study was “faulty,” despite internal research published months earlier confirming that a small number of accounts drive anti-vaccine sentiment.
An AP-NORC survey earlier this month found that the majority of Americans blame social media platforms like Facebook and their users for spreading misinformation.
But Ahmed said Facebook shouldn’t just shoulder blame for that problem.
“Facebook has taken decisions which have led to people receiving misinformation which caused them to die,” Ahmed said. “At this point, there should be a murder investigation.”
Seitz reported out of Columbus, Ohio.