
Court filings allege Meta misled public on risks to children

A court filing alleges that Meta misled the public about risks to children and that sex trafficking was tolerated on its platforms. The lawsuit, which involves more than 1,800 plaintiffs, claims Meta ignored harms to young users and failed to disclose issues such as mental health impacts and adult contact with minors. Despite internal research pointing to these dangers, Meta allegedly prioritized growth. The company has since rolled out new safety features, but the lawsuit centers on its alleged past negligence.
The filing, unsealed on Friday, alleges that sex trafficking was difficult to report and widely tolerated on Meta's platforms. It was submitted as part of a wider lawsuit against four social media companies.
In the plaintiffs' brief, Instagram's head of safety and well-being, Vaishnavi Jayakumar, is quoted as testifying that when she joined Meta in 2020, she learned the company had a 17-strike policy for accounts engaged in sex trafficking. In other words, an account could be flagged for the violation 16 times and would only be suspended on the 17th occurrence. "By any measure across the industry, [it was] a very, very high strike threshold," she testified.
Meta accused of downplaying risks to children and misleading the public
The allegations against Meta stem from a brief filed in an unprecedented multi-district litigation. More than 1,800 plaintiffs, including children, parents, school districts, and state attorneys general, have joined the lawsuit, which claims that Meta and the parent companies behind TikTok, Snapchat, and YouTube "relentlessly pursued a strategy of growth at all costs, recklessly ignoring the impact of their products on children's mental and physical health."
According to the brief, filed by the plaintiffs in the Northern District of California, Meta was aware of serious harms on its platforms and engaged in a pattern of deceit to downplay risks to young users. The plaintiffs say internal company documents corroborate the testimony. They also allege that Meta knew millions of adults were contacting minors on its platforms.
The plaintiffs further claim that Meta knew its products worsened mental health issues among teens, and that content related to eating disorders, suicide, and child sexual abuse was frequently detected yet rarely removed. The brief also alleges that the company failed to disclose these harms to the public or to Congress, and that it refused to implement safety fixes that would have protected young users from exposure to them.
“Meta has designed social media products and platforms that it is aware are addictive to kids, and they’re aware that those addictions lead to a whole host of serious mental health issues,” says Previn Warren, the co-lead attorney for the plaintiffs in the case. “Like tobacco, this is a situation where there are dangerous products that were marketed to kids,” Warren adds. “They did it anyway, because more usage meant more profits for the company.”
Brief paints a troubling picture of Meta's internal deliberations
The plaintiffs' brief, first reported by TIME, is based on sworn depositions of current and former Meta executives, internal communications, and company research and presentation materials obtained during discovery. It includes quotes and excerpts from thousands of pages of testimony and internal company documents. TIME was unable to view the underlying testimony or research cited in the brief because those materials remain under seal.
Even so, the brief offers a detailed and unflattering look at the company's internal research and deliberations about problems that have affected its platforms since at least 2017. The plaintiffs allege that Meta has courted young users since then, even though its internal research suggested its social media products could be addictive and dangerous to kids. According to the brief, Meta employees proposed ways to reduce these harms but were often blocked by executives.
Meanwhile, in the years since the lawsuit was filed, Meta has introduced new safety features designed to address some of the problems the plaintiffs describe. Last year, the company unveiled Instagram Teen Accounts, which set accounts to private by default for users between 13 and 18, limit sensitive content, turn off notifications at night, and restrict messages from adults the teen is not connected to.

