An American mother claims companies are responsible for the suicide of her 11-year-old daughter, who had an “extreme addiction” to social media.
Meta Platforms Inc. and Snap Inc. are to blame for the suicide of an 11-year-old child who was addicted to Instagram and Snapchat, the girl’s mother has alleged in a lawsuit.
The woman claims her daughter Selena Rodriguez struggled for two years with an “extreme addiction” to Meta’s photo-sharing platform and Snap’s messaging app before killing herself last year.
The lawsuit, filed in federal court in San Francisco, isn’t the first to blame a youth’s suicide on social media, but it comes at a sensitive time for platforms that engage millions of young people around the world.
In November, a group of US state attorneys general announced an investigation into Instagram’s efforts to appeal to children and young adults, targeting the risks the social network may pose to their mental health and well-being. The states’ investigation was launched after a former Facebook employee turned whistleblower testified before Congress that the company knew about, but did not disclose, the adverse effects of services like Instagram.
Social media backlash not limited to US

The company told the BBC it does not allow content promoting self-harm.
“We are devastated to learn of Selena’s passing and our hearts go out to her family,” a Snap spokesperson said in an emailed statement Friday. “While we cannot comment on the specifics of active litigation, nothing is more important to us than the well-being of our community.”
Meta and Snap knew or should have known that “their social media products were harming a significant percentage of their underage users,” according to Thursday’s lawsuit. “In other words, the defendants intentionally created an attractive nuisance for young children, but failed to provide adequate safeguards against the adverse effects they knew to occur on their wholly owned and controlled digital premises.”
Meta representatives did not respond to an email seeking comment.
A spokesperson for Meta said in November that claims the company puts profit before safety are false and that “we continue to build new features to help people who may face negative social comparisons or body image issues.”
Snap said in May it was suspending projects with two app makers “out of an abundance of caution for the safety of the Snapchat community,” in light of a wrongful-death and class-action lawsuit filed in California that accused the companies of failing to apply their own policies against cyber-harassment.
Tammy Rodriguez, who lives in Connecticut, said that when she tried to limit her daughter’s access to the platforms, the girl ran away from home. She took her daughter to a therapist who said “she had never seen a patient as addicted to social media as Selena,” according to the lawsuit.
The lawsuit directs its harshest criticism at Snapchat, saying the platform rewards users in “excessive and dangerous ways” for their engagement. The mother brings claims of product defect, negligence, and violation of California consumer protection law. One of the attorneys in the case is from the Social Media Victims Law Center, a Seattle-based legal advocacy group.
“Snapchat helps people connect with their real friends, without some of the public pressure and social comparison features of traditional social media platforms, and intentionally makes it difficult for strangers to contact young people,” the Snap spokesperson said. “We are working closely with many mental health organizations to provide integrated tools and resources to Snapchatters as part of our ongoing work to keep our community safe.”
Social media companies have largely been successful in fending off lawsuits accusing them of bodily harm thanks to a 1996 federal law that protects internet platforms from liability for what users post online.
The case is Rodriguez v. Meta Platforms Inc. f/k/a Facebook Inc., 3:22-cv-00401, US District Court, Northern District of California (San Francisco).
(Updates with Snap commentary.)
–With help from Naomi Nix.