Facebook Papers Show Deep Conflict Within
Facebook the company is losing control of Facebook the product—not to mention the last shreds of its carefully crafted, decade-old image as a benevolent company just wanting to connect the world.
A former employee provided thousands of pages of documents to Congress. They show an internally conflicted organization where evidence of the harm it causes is plentiful but solutions are lacking.
The documents reveal how Facebook, despite its often avowed good intentions, appears to have slow-walked or ignored efforts to correct real harms it has caused and magnified. They show numerous instances where rank-and-file employees and researchers uncovered deep-rooted problems that the company then ignored or neglected.
CEO Mark Zuckerberg is ultimately responsible for the current state of affairs. One former employee said that Zuckerberg holds dictatorial control over a company that gathers data from, and offers free services to, around 3 billion people worldwide.
“Ultimately, it rests with Mark and whatever his prerogative is—and it has always been to grow, to increase his power and his reach,” said Jennifer Grygiel, a Syracuse University communications professor who’s followed Facebook closely for years.
Zuckerberg holds a firm grip on Facebook Inc. He holds the majority of the company’s voting shares, controls its board of directors and has increasingly surrounded himself with executives who don’t appear to question his vision.
Yet he has been unable to stop Facebook's stagnating user growth and shrinking engagement in important markets such as the United States and Europe. Worse, the company is losing the attention of its most important demographic, teenagers and young adults, with no clear path to winning it back, its own documents reveal.
Young adults engage with Facebook far less than their older cohorts, seeing it as an “outdated network” with “irrelevant content” that provides limited value for them, according to a November 2020 internal document. It is “boring, misleading and negative,” they say.
So, in other words: Facebook is a platform for the old.
Facebook's user base has been aging faster, on average, than the general population, the company's researchers found. Unless it can reverse that trend, fewer young people will sign on, threatening the monthly user numbers that are essential for selling advertising. Facebook says its products are still widely used by teens, although it acknowledges there's "tough competition" from TikTok, Snapchat and the like.
To sustain its expansion and influence, Facebook has pushed for high user growth outside the United States and Western Europe. But as it entered new regions, it failed to anticipate the unintended consequences of signing up millions of users without also providing the staff and systems needed to curb hate speech, misinformation and incitement to violence.
In Afghanistan and Myanmar, for instance, extremist language has flourished due to a systemic lack of language support for content moderation, whether human or AI-driven. In Myanmar, the platform has been linked to atrocities committed against the country's minority Rohingya Muslim population.
Facebook has struggled even to acknowledge the real-world collateral damage that comes with its rapid growth, from opaque algorithms that radicalize users to the platform's role in facilitating human trafficking and teen suicide.
Internal efforts to mitigate such problems have often been pushed aside or abandoned when solutions conflict with growth — and, by extension, profit.
Backed into a corner by hard evidence from the leaked documents, the company has redoubled its efforts to defend its decisions rather than address its problems.
“We do not and we have not prioritized engagement over safety,” Monika Bickert, Facebook’s head of global policy management, told The Associated Press this month following congressional testimony from whistleblower and former Facebook employee Frances Haugen. In the days since Haugen’s testimony and appearance on “60 Minutes”—during which Zuckerberg posted a video of himself sailing with his wife Priscilla Chan—Facebook has tried to discredit Haugen by repeatedly pointing out that she didn’t directly work on many of the problems she revealed.
“A curated selection out of millions of documents at Facebook can in no way be used to draw fair conclusions about us,” Facebook tweeted from its public relations “newsroom” account earlier this month, following the company’s discovery that a group of news organizations was working on stories about the internal documents.
“At the heart of these stories is a premise which is false. Yes, we’re a business and we make profit, but the idea that we do so at the expense of people’s safety or wellbeing misunderstands where our own commercial interests lie,” Facebook said in a prepared statement Friday. “The truth is we’ve invested $13 billion and have over 40,000 people to do one job: keep people safe on Facebook.”
Statements like these are the latest sign that Facebook has settled into what Sophie Zhang, a former Facebook data scientist, described as a "siege mentality" at the company. Zhang last year claimed that the social network ignored fake accounts created to disrupt foreign elections. With more whistleblowers, notably Haugen, coming forward, it has only gotten worse.
“Facebook has been going through a bit of an authoritarian narrative spiral, where it becomes less responsive to employee criticism, to internal dissent and in some cases cracks down upon it,” said Zhang, who was fired from Facebook in the fall of 2020. “And this leads to more internal dissent.”
“I have seen many colleagues that are extremely frustrated and angry, while at the same time, feeling powerless and (disheartened) about the current situation,” one employee, whose name was redacted, wrote on an internal message board after Facebook decided last year to leave up incendiary posts by former President Donald Trump that suggested Minneapolis protesters could be shot. “My view is, if you want to fix Facebook, do it within.”
This story is based in part on disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by Haugen's legal counsel. The redacted versions received by Congress were obtained by a consortium of news organizations, including The Associated Press.
The documents detail research collected over many years on issues including the trafficking of domestic workers in the Middle East; the company's handling of Arabic-language content, which critics say muzzles free speech while letting hate speech flourish; rampant anti-vaccine misinformation; and subtle changes to how people see posts in their feeds.
The company insists it “does not conduct research and then systematically and willfully ignore it if the findings are inconvenient for the company.” This claim, Facebook said in a statement, can “only be made by cherry-picking selective quotes from individual pieces of leaked material in a way that presents complex and nuanced issues as if there is only ever one right answer.”
Haugen, who testified before the Senate this month that Facebook’s products “harm children, stoke division and weaken our democracy,” said the company should declare “moral bankruptcy” if it is to move forward from all this.
At this stage, that seems unlikely. The conflict between profit and people runs deep within Facebook, and the company does not appear ready to give up its narrative that it is good for the world, even as it regularly makes decisions intended to maximize growth.
“Facebook did regular surveys of its employees — what percentage of employees believe that Facebook is making the world a better place,” Zhang recalled.
“It was around 70 percent when I joined. It was around 50 percent when I left,” said Zhang, who spent more than two years at the company.
Facebook has yet to disclose the current figure.