WASHINGTON—Federal Trade Commission staffers have begun looking into disclosures that Facebook Inc.’s internal company research had identified ill effects from its products, according to people familiar with the matter.
Officials are looking into whether Facebook research documents indicate that it might have violated a 2019 settlement with the agency over privacy concerns, for which the company paid a record $5 billion penalty, one of the people said.
The FTC declined to comment.
The internal research found evidence that the company’s algorithms foster discord and that its Instagram app is harmful for a sizable percentage of its users, notably teenage girls, among other findings. The documents provided the foundation for The Wall Street Journal’s Facebook Files series.
In a statement, Facebook said that it is “always ready to answer regulators’ questions and will continue to cooperate with government inquiries.”
In a regulatory filing Tuesday, Facebook said that in September it “became subject to government investigations and requests relating to a former employee’s allegations and release of internal company documents concerning, among other things, our algorithms, advertising and user metrics, and content enforcement practices, as well as misinformation and other undesirable activity on our platform, and user well-being.”
The company has previously said that many of the research documents released by former Facebook product manager Frances Haugen have been misinterpreted and that the company has “invested heavily in people and technology to keep our platform safe.”
The Facebook documents have triggered calls by lawmakers and children’s advocates for the FTC to investigate whether Facebook engaged in deceptive or misleading conduct.
Sen. Richard Blumenthal (D., Conn.), who chairs the Senate consumer protection subcommittee, said one concern for the FTC is whether Facebook withheld information from the agency concerning its internal research.
“I think the FTC should be really angry if Facebook concealed this material from them as it did from us in the Congress and the public,” Mr. Blumenthal said in an interview. He said that he and Sen. Marsha Blackburn (R., Tenn.) asked Facebook in August about internal research into the effects of its products on children, “and they evaded our questions.”
Three other lawmakers—Sen. Ed Markey (D., Mass.) and Reps. Kathy Castor (D., Fla.) and Lori Trahan (D., Mass.)—sent a letter to the FTC on Oct. 8. They urged the commission to use its enforcement powers to make sure that all “powerful technology platforms comply with their public statements and policies on children’s and teen’s [sic] privacy.”
Separately, the Securities and Exchange Commission has been communicating with attorneys for Ms. Haugen, according to one of the lawyers representing her. The SEC hasn’t commented.
The FTC also has been in communication with Ms. Haugen’s team, according to another of the people familiar with the matter.
Unlike the SEC, the FTC doesn’t have a formal program to protect whistleblowers like Ms. Haugen. The FTC is nonetheless a key government regulator of business conduct on the internet in its role policing the marketplace for unfair and deceptive trade practices.
One issue likely being explored by FTC staffers is whether Facebook had a legal obligation to warn users about the risks revealed by internal research findings, said former FTC Chairman William Kovacic, now a law professor at George Washington University. If Facebook failed to do so, that could constitute a deceptive trade practice, he said.
David Vladeck, a former head of the FTC’s consumer-protection bureau, said the agency also could consider whether the company was acting appropriately, given the findings of the company research. However, he added, any case might not be easy to prove.
“You have to take into account Facebook’s denial that its research really shows harm, and [its position] that the whistleblower has misstated or misrepresented the research,” he said.
The FTC launched a study of social-media platforms last December, asking detailed questions of Facebook and other internet companies about how they tailor their services to children.
“The questions push to uncover how children and families are targeted and categorized,” several FTC commissioners wrote in a statement at the time.
One question, for example, required companies to produce “all strategies, plans, presentations, Analyses, machine learning or artificial intelligence, and/or efforts to Identify usage patterns associated with Children and Teens, validate results, and/or monetize this usage, Including all efforts to maintain and/or increase User Engagement by Children and Teens.”
Facebook’s 2019 settlement with the FTC came in response to concerns that millions of Facebook users’ information had been improperly shared with a political data-analytics firm, Cambridge Analytica. The FTC required Facebook to toughen its privacy and data-security protections.
That settlement absolved Facebook and its top officials of any other consumer-protection violations known to the FTC at the time—a provision that Democrats on the commission, who dissented from the decision, criticized as overly broad.
Some observers, including Mr. Vladeck, don’t think that will be an obstacle, because the FTC wouldn’t have been aware of the disclosures in the documents Ms. Haugen released until recently.
The FTC is now led by Lina Khan, a critic of big tech companies who is aiming to make the commission a more energetic industry watchdog.
“Under the new management, they recognize their role and the commitment they’ve made, and they intend to do a better job,” said Jeff Chester, executive director of the Center for Digital Democracy, a nonprofit that advocates for privacy and consumer protection online.
Write to John D. McKinnon at john.mckinnon@wsj.com and Brent Kendall at brent.kendall@wsj.com
Copyright ©2021 Dow Jones & Company, Inc. All Rights Reserved.