A bombshell lawsuit against Meta, the parent company of Facebook, Instagram, and WhatsApp, has brought startling allegations from a former employee to light. Vaishnavi Jayakumar, who once led safety and well-being for Instagram, testified that Meta allegedly maintained a policy allowing accounts to accumulate 16 violations for “prostitution and sexual solicitation” before being suspended on the 17th strike. This “17x” policy, she stated, was “very, very high” by industry standards, even for human trafficking offenses.
Jayakumar also testified that, as of March 2020, Instagram lacked a direct way for users to report child sexual abuse material (CSAM). She said she was surprised by the gap and claimed her attempts to raise the issue were dismissed because fixing it would be “too much work,” even though other “far less serious violations” had in-app reporting options.
The lawsuit broadly accuses Meta and other tech companies like TikTok and Snapchat of contributing to a “mental health crisis” among teenagers, comparing their tactics to “Big Tobacco” in addicting young users for profit while intentionally omitting parental safeguards.
Meta strongly disputes these claims, stating that it now operates a “one-strike” policy for severe violations such as human exploitation, under which accounts are removed immediately. The company rejects the allegations as “cherry-picked quotes and misinformed opinions” that present a “deliberately misleading picture,” and asserts its commitment to protecting teens, citing tools and parental controls developed over the past decade. The legal battle continues to unfold, keeping a spotlight on platform safety and corporate accountability.