As Frances Haugen took her seat in an opulent Victorian-style room in London’s Houses of Parliament on Monday, she remarked that the surroundings were much grander than those in Washington, where the Facebook whistleblower had given evidence to Senators two weeks earlier.
One member of the panel of British lawmakers whom she was there to testify before responded that the small room, decorated with ruby-red wallpaper and grand paintings, had been chosen for precisely that purpose: COVID-19 restrictions meant the hearing was closed off to the general public and most journalists—so there was no need to hold it in the drab office building across the road, where evidence sessions usually accommodate a larger audience.
Haugen’s stop in London was the first on an extensive European tour planned for November, in which the whistleblower will meet lawmakers from across the continent who are drafting laws to place new restrictions on Facebook and other Big Tech companies. The effort is part of a push by Haugen and her team to turn up the heat on Facebook in jurisdictions that historically have been faster—and more willing—to regulate the impact that American tech giants have on people’s lives and the societies they live in. Efforts to rein in Facebook’s algorithms in the U.S. have hit roadblocks of partisan disagreement over alleged censorship, even as Haugen’s disclosures have brought Republicans and Democrats closer to agreement on the need to regulate the company.
“Mark Zuckerberg has unilateral control over 3 billion people,” Haugen said of Facebook’s CEO as the hearing got underway. “There’s no will at the top to make sure these systems are run in an adequately safe way. And I think until we bring in a counterweight, things will be operated for the shareholders’ interest and not for the public interest.”
The Facebook whistleblower was in London on the same day that more than a dozen publications released an avalanche of new stories based on internal Facebook documents she had shared with them, weeks after the Wall Street Journal published an initial batch. The “Facebook Papers” stories published on Monday revealed further details about how the company fails to moderate harmful content in developing countries, how it stumbled ahead of the 2020 U.S. election, and how it was aware that Filipina maids were being abused and sold on the platform but did little to act.
In a quarterly earnings call on Monday, in which the company disclosed increased year-on-year profits, Zuckerberg called the stories “a coordinated effort to selectively use leaked documents to paint a false picture of our company.” Facebook spokespeople have said the company welcomes government regulation and that it spends more on safety work than its competitors.
As the “Facebook Papers” stories added to mounting evidence that Facebook systemically chose profit over safety time and time again, Haugen was embarking on the next stage of her project: making sure new laws around the world reflect the internal reality of how giant social media companies work.
The lawmakers quizzing Haugen in London were part of a multi-party committee scrutinizing the U.K.’s wide-ranging new Online Safety Bill, which, if passed, would require tech companies to prevent “online harms” or face steep penalties.
Haugen expressed support for the bill. “I am incredibly excited and proud of the U.K. for taking such a world-leading stance with regard to thinking about regulating social platforms,” Haugen told the lawmakers. “The Global South currently does not have the resources to stand up and save their own lives. They are excluded from these discussions,” she said. “The U.K. has a tradition of leading policy in ways that are followed around the world. I can’t imagine Mark [Zuckerberg] isn’t paying attention to what you’re doing. This is a critical moment for the U.K. to stand up and make sure these platforms are in the public good, and are designed for safety.”
One side effect of the decision to hold the hearing in person, in an old room without air conditioning, was that the temperature rose quickly as Haugen spoke about Facebook’s ill effects. During a break in proceedings, officials opened two windows to try to cool the room down. Catching a moment to breathe, Haugen remarked that the committee room in Washington had at least been cooler.
As the hearing progressed, however, it became clear that the regulations proposed in the bill and those Haugen would recommend were quite different.
For starters, Haugen’s central recommendation is for regulation to focus on the algorithmic amplification systems that, according to internal Facebook research, systemically boost divisive and polarizing content. The draft U.K. bill focuses largely on content, not algorithmic amplification.
Damian Collins, the chair of the panel scrutinizing the bill, told TIME he may suggest amending it to place more emphasis on systemic algorithmic harms, in light of the evidence given by Haugen and fellow Facebook whistleblower Sophie Zhang, who testified before his committee last week.
“The analysis of these [algorithmic] systems is essential. This can’t just be about content moderation,” Collins told TIME. “That is something we’re looking at.”
Collins noted that the draft bill would give a national watchdog powers to effectively subpoena Facebook for evidence about how its algorithms work, but said his committee was considering whether to change the wording of the law to specify the kinds of requests the regulator could make.
In her testimony, Haugen said that an ideal piece of regulation would require Facebook to disclose which safety systems are in place, in which languages they operate, and how successful they are at doing their jobs, broken down by language. Doing so, she suggested, would expose the inequities in Facebook’s global safety apparatus and increase the incentive for the company to fix its systems.
The suggestion appeared to resonate with the lawmakers who have the power to change the legislation. “I think that is a really good suggestion,” Collins said. “This is not just a bill about content.”