Facebook products 'harm children, stoke division,' whistleblower testifies
WASHINGTON - A Facebook whistleblower told Congress on Tuesday that the social media company and its products harm children and fuel hate and misinformation in the U.S., but its leaders refuse to make changes because of a desire to put "astronomical profits before people."
"I believe Facebook’s products harm children, stoke division, and weaken our democracy," Frances Haugen, a former Facebook data scientist, testified to Senate Commerce subcommittee on consumer protection. "The company’s leadership knows how to make Facebook and Instagram safer but won’t make the necessary changes because they have put their astronomical profits before people."
"Congressional action is needed. They won’t solve this crisis without your help," she added.
Haugen, a 37-year-old data expert from Iowa, was testifying before Congress for the first time after exposing Facebook’s awareness of apparent harm to some teens from Instagram and accusing the company of dishonesty in its public fight against hate and polarization.
She was identified Sunday in a "60 Minutes" interview as the whistleblower who anonymously filed complaints with federal law enforcement. The complaints alleged that Facebook’s own research shows how it magnifies hate and misinformation, leads to increased polarization and that Instagram, specifically, can harm teenage girls' mental health.
RELATED: Whistleblower claims Facebook fed US Capitol riot, magnified misinformation
The information was first reported by the Wall Street Journal in a series of stories called "The Facebook Files," which painted a picture of a company focused on growth and its own interests instead of the public good.
Haugen came forward with the wide-ranging condemnation of Facebook — complete with tens of thousands of pages of internal research documents she secretly copied before leaving her job in Facebook’s civic integrity unit.
Haugen has a degree in computer engineering and a master’s degree in business from Harvard. Before being recruited by Facebook in 2019, she worked for 15 years at companies including Google and Pinterest.
Former Facebook employee Frances Haugen testifies during a Senate Committee on Commerce, Science, and Transportation hearing entitled 'Protecting Kids Online: Testimony from a Facebook Whistleblower' on Capitol Hill October 5, 2021, in Washington, D.C.
Facebook issued a statement in response to Haugen’s testimony, saying the company doesn’t agree with "her characterization of many issues."
"Today, a Senate Commerce subcommittee held a hearing with a former product manager at Facebook who worked for the company for less than two years, had no direct reports, never attended a decision-point meeting with C-level executives — and testified more than six times to not working on the subject matter in question," said Lena Pietsch, Facebook’s director of policy communications. "We don’t agree with her characterization of the many issues she testified about.
"Despite all this, we agree on one thing; it’s time to begin to create standard rules for the internet. It’s been 25 years since the rules for the internet have updated, and instead of expecting the industry to make societal decisions that belong to legislators, it is time for Congress to act," Pietsch continued.
The Senate panel is examining Facebook’s use of information from its own researchers on Instagram that could indicate potential harm for some of its young users, especially girls, while it publicly downplayed the negative impacts.
For some of the teens devoted to Instagram, the peer pressure generated by the visually focused app led to mental health and body-image problems, and in some cases, eating disorders and suicidal thoughts, the research leaked by Haugen showed. One internal study cited 13.5% of teen girls saying Instagram makes thoughts of suicide worse and 17% of teen girls saying it makes eating disorders worse.
"The company intentionally hides vital information from the public, from the U.S. government and from governments around the world," Haugen said in her testimony. "The documents I have provided to Congress prove that Facebook has repeatedly misled the public about what its own research reveals about the safety of children, the efficacy of its artificial intelligence systems and its role in spreading divisive and extreme messages."
The social network giant has 2.8 billion users worldwide and nearly $1 trillion in market value. Haugen says she hopes that by coming forward she will help spur the government to put regulations in place for Facebook’s activities.
Like fellow tech giants Google, Amazon and Apple, Facebook has for years enjoyed minimal regulation in Washington. A number of bipartisan legislative proposals for the tech industry address data privacy, protection of young people and anti-competitive conduct. But getting new laws through Congress is a heavy slog. The Federal Trade Commission has adopted a stricter stance recently toward Facebook and other companies.
"When we realized tobacco companies were hiding the harms it caused, the government took action. When we figured out cars were safer with seat belts, the government took action. And today, the government is taking action against companies that hid evidence on opioids," Haugen testified. "I implore you to do the same here."
In dialogue with receptive senators of both parties, Haugen, who focused on algorithm products in her work at Facebook, explained the importance to the company of algorithms that govern what shows up on users’ news feeds. She said a 2018 change to the content flow contributed to more divisiveness and ill will in a network ostensibly created to bring people closer together.
Despite the hostility that the new algorithms were feeding, Facebook found that they helped keep people coming back — a pattern that helped the social media giant sell more of the digital ads that generate most of its revenue.
Haugen said Facebook prematurely turned off safeguards designed to thwart misinformation and incitement to violence after Joe Biden defeated Donald Trump last year, alleging that contributed to the deadly Jan. 6 assault on the U.S. Capitol.
After the November election, Facebook dissolved the civic integrity unit where Haugen had been working. In the "60 Minutes" interview, she said that was the moment she realized: "I don’t trust that they’re willing to actually invest what needs to be invested to keep Facebook from being dangerous."
Haugen said she believed Facebook didn’t set out to build a destructive platform. But "in the end, the buck stops with Mark," she said, referring to Zuckerberg, who controls more than 50% of Facebook’s voting shares. "There is no one currently holding Mark accountable but himself."
Haugen said she believed that Zuckerberg was familiar with some of the internal research showing concerns about potential negative impacts of Instagram.
Prior to the hearing, Facebook had maintained that Haugen’s allegations are misleading and insisted there is no evidence to support the premise that it is the primary cause of social polarization.
"Even with the most sophisticated technology, which I believe we deploy, even with the tens of thousands of people that we employ to try and maintain safety and integrity on our platform, we’re never going to be absolutely on top of this 100% of the time," Nick Clegg, Facebook’s vice president of policy and public affairs, said Sunday on CNN’s "Reliable Sources."
Clegg also wrote to Facebook employees in a memo last week that "social media has had a big impact on society in recent years, and Facebook is often a place where much of this debate plays out."
Amid the Instagram research claims, Facebook paused the development of a kids' version of Instagram, geared toward children under 13, to address concerns that have been raised about the vulnerability of younger users.
"I would be sincerely surprised if they do not continue working on Instagram Kids," Haugen told the Senate panel. "Facebook understands that if they want to continue to grow, they have to find new users, they have to make sure that the next generation is just as engaged with Instagram as the current one. And the way they’ll do that is by making sure that children establish habits before they have good self-regulation."
RELATED: Facebook pauses development of 'Instagram Kids' app after pushback
Separately, a massive global outage took down Facebook, Instagram and the company’s WhatsApp messaging platform on Monday. Facebook didn’t say what might have caused the outage, which began around 11:40 a.m. EDT and was still not fixed more than six hours later.
RELATED: Facebook, Instagram, WhatsApp down: Platforms 'coming back online' after outage
The Associated Press contributed to this report. It was reported from Cincinnati.