Instagram is being sued for using addictive features linked to depression, anxiety in kids

Meta Platforms and its subsidiary Instagram are being sued for allegedly contributing to a youth mental health crisis by deliberately fostering addictive behaviours on their platforms.


Meta Platforms and its Instagram unit have been accused of fuelling a youth mental health crisis by making their social media platforms addictive. Source: Getty / Matt Cardy

Dozens of US states are suing Meta Platforms and its Instagram unit, accusing them of fuelling a youth mental health crisis by making their social media platforms addictive.

In a complaint filed in the Oakland, California, federal court on Tuesday, 33 states including California and New York said Meta repeatedly misled the public about the dangers of its platforms, and knowingly induced young children and teenagers into addictive and compulsive social media use. Meta also operates Facebook.

"Meta has harnessed powerful and unprecedented technologies to entice, engage, and ultimately ensnare youth and teens," the complaint said. "Its motive is profit."
Children have long been an appealing demographic for big business, which hopes to attract them as consumers at a young age, when they may be more impressionable, and to solidify brand loyalty early. For Meta, younger consumers may help secure more advertisers who hope kids will buy their products as they grow up.

But the states noted research has associated children's use of Meta's social media platforms with "depression, anxiety, insomnia, interference with education and daily life, and many other negative outcomes."
Meta said it was "disappointed" in the lawsuit.

"Instead of working productively with companies across the industry to create clear, age-appropriate standards for the many apps teens use, the attorneys general have chosen this path," the company said.

Eight other US states and Washington, DC are filing similar lawsuits against Meta on Tuesday, bringing the total number of authorities taking action against the California-based company to 42.

The cases are the latest in a string of legal actions against social media companies on behalf of children and teens.
Meta's headquarters in Menlo Park, California. Thirty-three states, including California and New York, alleged that Meta consistently deceived the public about the risks of its platforms and encouraged addictive social media use among young people. Source: Getty / Tayfun Coskun
Meta, ByteDance's TikTok and Google's YouTube already face hundreds of lawsuits filed on behalf of children and school districts about the addictiveness of social media.

In Tuesday's cases, Meta could face civil penalties of A$1,570 to A$78,500 for each violation of various state laws — an amount that could add up quickly given the millions of children and teens who use Instagram.
Much of the focus on Meta stemmed from a whistleblower's release of documents in 2021 that showed the company knew Instagram, which began as a photo-sharing app, was addictive and worsened body image issues for some teen girls.

The lawsuit by the 33 states alleged that Meta has strived to ensure that young people spend as much time as possible on social media despite knowing that they are susceptible to the need for approval in the form of "likes" from other users about their content.

"Meta has been harming our children and teens, cultivating addiction to boost corporate profits," said California Attorney General Rob Bonta, whose state includes Meta's headquarters.
The company was also accused of violating a law banning the collection of data of children under age 13, and deceptively denying that its social media was harmful.

"Meta did not disclose that its algorithms were designed to capitalise on young users' dopamine responses and create an addictive cycle of engagement," it added.

Dopamine is a type of neurotransmitter that plays a role in feelings of pleasure.
The complaint said Meta's refusal to accept responsibility extended to the company's effort last year to distance itself from a 14-year-old girl's suicide in the UK after she was exposed on Instagram to content about suicide and self-injury.

A coroner rejected a Meta executive's claim that such content was "safe" for children, finding that the girl likely binged on harmful content that normalised the depression she had felt before killing herself.

The states alleged Meta is seeking to expand its harmful practices into virtual reality, including its Horizon Worlds platform and the WhatsApp and Messenger apps.

Published 25 October 2023 8:10am
Updated 25 October 2023 1:15pm
Source: Reuters