Meta Platforms Inc. and Snap Inc. are responsible for the suicide of an 11-year-old girl who suffered from depression after becoming addicted to Instagram and Snapchat, according to a lawsuit filed by the girl’s mother.
The woman claims her daughter Selena Rodriguez struggled for two years with an “extreme addiction” to the social media platforms before taking her own life last year.
Instagram and Snapchat have deliberately designed algorithms that keep teens hooked on their platforms and “promote problematic and excessive use that they know is indicative of addictive and self-destructive use,” according to the lawsuit, filed in federal court in San Francisco.
Mother Tammy Rodriguez, who lives in Connecticut, said her daughter ran away from home when Rodriguez tried to impose limits on Selena’s access to the platforms. She then took her daughter to a therapist, who said “she had never seen a patient as addicted to social media as Selena.”
The lawsuit also notes that: “While on Instagram and Snapchat, Selena was constantly solicited for sexually exploitive content. She succumbed to the pressure and sent sexually explicit images using Snapchat, which were leaked and shared with her classmates, increasing the ridicule and embarrassment she experienced at school.”
Attorney Matthew Bergman, who founded the Social Media Victims Law Center in Seattle and represents Tammy Rodriguez, says the case is one of the first of its kind against Meta, formerly known as Facebook Inc.
A separate suit filed by Bergman last week involves a mother in Oregon who blames Meta and Snap for her 15-year-old daughter’s “numerous mental health conditions including multiple inpatient psychiatric admissions, an eating disorder, self-harm, and physically and mentally abusive behaviors toward her mother and siblings.”
The suit also states that the teen was “frequently messaged and solicited for sexually exploitive content and acts by adult men through the applications.”
Bergman added that he anticipates many more such lawsuits in the future, particularly after former Facebook employee Frances Haugen testified before the U.S. Senate Commerce Committee in October of last year.
“I am here today because I believe Facebook’s products harm children,” whistleblower Haugen told the Senate. A consumer protection investigation of Meta, based on documents shared by Haugen, was launched the following month, with a particular focus on how Meta was both aware of and intentionally hid evidence that Instagram harms children and teenagers.
In 2017, 14-year-old Molly Russell took her own life after becoming addicted to self-harm content uploaded to Instagram. Her father, Ian Russell, accused Facebook-owned Instagram of “helping to kill” his daughter. Haugen suggested that Instagram’s algorithms may have shown Molly harmful content before she had even searched for it.
In 2021, the Russell family had an app using their deceased daughter’s likeness for pornographic content removed from the Google Play Store. Molly’s photograph had been stolen by a “forced feminization” fetishist and used in male “sissy kink” stories.
According to a decade-long study published in 2021 in the Journal of Youth and Adolescence, “girls who started using social media at two to three hours a day or more at age 13, and then increased [that use] over time, had the highest levels of suicide risk in emerging adulthood.” Among boys, however, no such pattern emerged.
In the 2020 Netflix documentary The Social Dilemma, social psychologist Jonathan Haidt explains that suicide rates among young women and girls have skyrocketed since 2009 – the year that social media first became available on smartphones.
Reduxx is a newly-launched, independent source of pro-woman, pro-child safeguarding news and commentary. We’re able to continue our work exposing predators and reporting the truth thanks to the generous support of our readers.