
Meta Quieter on Election Misinformation as Midterms Loom

WASHINGTON — Facebook owner Meta is quietly curtailing some of the safeguards designed to thwart voting misinformation or foreign interference in U.S. elections as the November midterm vote approaches.

It’s a sharp departure from the social media giant’s multibillion-dollar efforts to enhance the accuracy of posts about U.S. elections and regain trust from lawmakers and the public after their outrage over learning the company had exploited people’s data and allowed falsehoods to overrun its site during the 2016 campaign.

The pivot is raising alarm about Meta’s priorities and about how some might exploit the world’s most popular social media platforms to spread misleading claims, launch fake accounts and rile up partisan extremists.

“They’re not talking about it,” said former Facebook policy director Katie Harbath, now the CEO of the tech and policy firm Anchor Change. “Best case scenario: They’re still doing a lot behind the scenes. Worst case scenario: They pull back, and we don’t know how that’s going to manifest itself for the midterms on the platforms.”


Public communication about the company’s response to election misinformation has gone decidedly quiet. Between 2018 and 2020, the company issued more than 30 statements detailing how it would combat U.S. election misinformation, stop foreign adversaries from running ads around the vote, and suppress divisive hate speech.

High-ranking executives held question-and-answer sessions with reporters to discuss the new policies. Mark Zuckerberg, the CEO, wrote Facebook posts about the company’s efforts against false election information and penned opinion articles calling for more regulation to address foreign interference in U.S. elections via social media.


This year, Meta has released only a single-page plan for the fall elections, even as potential threats to the vote remain clear. Many Republican candidates continue to push false claims about the U.S. presidential election on social media, while Russia and China keep running aggressive social media propaganda campaigns aimed at further dividing Americans.

Meta says election security remains a top priority and that its policies for countering foreign interference and misinformation are now hardwired into the company’s operations.

“With every election, we incorporate what we’ve learned into new processes and have established channels to share information with the government and our industry partners,” Meta spokesman Tom Reynolds said.

However, he declined to say how many people would be dedicated full-time to safeguarding this year’s U.S. elections.


During the 2018 election cycle, the company offered reporters tours and photos of its election response war room and provided head counts of the team staffing it. But The New York Times reported the number of Meta employees working on this year’s election has been cut from 300 to 60, a figure Meta disputes.

Reynolds said Meta will pull hundreds of employees from more than 40 of the company’s other teams to monitor the upcoming vote alongside the election team, whose size the company will not specify.

The company is continuing many of the initiatives it developed to reduce election misinformation. That includes a fact-checking program, started in 2016, that enlists news organizations to debunk viral falsehoods spreading on Facebook and Instagram. The Associated Press is part of Meta’s fact-checking program.

Meta has also launched a new feature that lets people look up details about political ads, including how advertisers on Facebook and Instagram target individuals based on their interests.


Meta, however, has blocked other efforts to track election misinformation on its platforms.

It has stopped improving CrowdTangle, a tool it offered to newsrooms around the world that provides insights into trending social media posts. Journalists, fact-checkers, and researchers used the tool to analyze Facebook content, including tracing popular misinformation and who is responsible for spreading it.

That tool is now “dying,” former CrowdTangle CEO Brandon Silverman, who left Meta last year, told the Senate Judiciary Committee this spring.

Silverman told the AP that CrowdTangle had been working on upgrades to make it easier to search the text of internet memes, which are often used to spread misleading claims and evade fact-checkers.

“There’s no real shortage of ways you can organize this data to make it useful for a lot of different parts of the fact-checking community, newsrooms and broader civil society,” Silverman said.


That enthusiasm for transparency is not shared by everyone at Meta, Silverman said. CrowdTangle has not received any updates or new features in the past year and has suffered long outages in recent months.

Meta has also shut down efforts to examine how misinformation spreads through political advertisements.

The company banned two New York University researchers from Facebook, saying they had collected unauthorized data from the platform. The ban came hours after NYU researcher Laura Edelson said she planned to study the disinformation spread on Facebook around the Jan. 6, 2021, attack on the U.S. Capitol, which is currently the focus of a House investigation.

“What we found, when we looked closely, is that their systems were probably dangerous for a lot of their users,” Edelson said.

Privately, former and current Meta employees say that exposing the risks surrounding American elections has also exposed the company to public and political backlash.

Republicans routinely accuse Facebook of unfairly censoring conservatives, some of whom have been kicked off for breaking the company’s rules. Democrats, meanwhile, regularly complain the tech company hasn’t gone far enough to curb disinformation.

“It’s something that’s so politically fraught, they’re more trying to shy away from it than jump in head first,” said Harbath, the former Facebook policy director. “They just see it as a big old pile of headaches.”

Meanwhile, the threat of meaningful regulation in the U.S. no longer hangs over the company, as Congress has failed to reach a consensus on what oversight the multibillion-dollar business should face.

Free from that threat, Meta’s leaders have devoted the company’s time, money and resources to a new project in recent months.

Zuckerberg dived into this massive rebranding and reorganization of Facebook last October, when he changed the company’s name to Meta Platforms Inc. He plans to spend years and billions of dollars evolving his social media platforms into a nascent virtual reality construct called the “metaverse” — sort of like the internet brought to life, rendered in 3D.

Now, his Facebook posts are mainly product announcements, while unsigned company blog posts carry the news about election readiness.

In a Facebook post last October, after a former employee leaked internal documents showing how the platform magnifies hate and misinformation, Zuckerberg defended the company and reminded his followers that he had called on Congress to update election regulations for the digital age.

“I know it’s frustrating to see the good work we do get mischaracterized, especially for those of you who are making important contributions across safety, integrity, research and product,” he wrote on Oct. 5. “But I believe that over the long term if we keep trying to do what’s right and delivering experiences that improve people’s lives, it will be better for our community and our business.”

It was the last time he discussed the Menlo Park, California-based company’s election work in a public Facebook post.

Associated Press technology writer Barbara Ortutay contributed to this report.
