Social media platforms are facing added scrutiny for their practices and engagement with youth in the wake of growing concern about the adverse effects of online activity.
In the biggest challenge to operators of social platforms to date, 33 states recently filed a joint federal suit in California against Meta Platforms Inc., alleging the company is deliberately harming young people and contributing to a mental health crisis by designing features on Instagram and Facebook that attract and bind children to its platforms.
The suit claims that Meta routinely collects data on children under 13 without parental consent, in violation of federal law, as part of a broader “scheme to exploit young users for profit.”
Such practices have been on the radar of SafeOC, which has warned parents about the myriad dangers children face online. From fraud to sexual exploitation to cyberbullying to hate and extremist recruiting, internet users navigate a sometimes dangerous minefield amid the useful and entertaining content. The SafeOC website also highlights the most popular apps among young users, the dangers associated with each, and the security settings parents can use to protect their children.
According to the lawsuit, “Meta has repeatedly misled the public about the substantial dangers of its Social Media Platforms. It has concealed the ways in which these Platforms exploit and manipulate its most vulnerable consumers: teenagers and children.”
The lawsuit alleges Meta’s strategy involved four parts:
- A business model focused on maximizing young users’ time and attention on its Social Media Platforms;
- Harmful and psychologically manipulative product features to induce young users’ compulsive and extended platform use, while falsely assuring the public that its features were safe and suitable for young users;
- Misleading reports boasting a deceptively low incidence of user harms;
- Refusal to abandon its use of known harmful features.
The lawsuit states that Meta generates most of its revenue from advertisers, which target users based on the personal data Meta collects.
Concern has been mounting for years about internet addiction and the rapid rise of youth dependence on devices and social media platforms, along with its consequences. Whistleblower Frances Haugen first lifted the veil in 2021, showing how deeply social media companies were involved in efforts to lure and tie children to their products.
Haugen, a former data scientist at Facebook, leaked internal company data to the Wall Street Journal, including a study that found 13.5 percent of U.K. teen girls said their suicidal thoughts became more frequent after starting on Instagram, and another that found 17 percent said their eating disorders got worse after using Instagram. About 32 percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse, according to the Wall Street Journal.
Courts step in
Two years after the revelations, the courts are being asked to take action. The federal lawsuit in California, which seeks financial damages, restitution, and an end to Meta’s allegedly illegal practices, is not the only case.
Attorneys General in other states are also filing lawsuits, bringing the total taking action to 41 states plus Washington, D.C. Additionally, a growing number of private citizens have launched litigation.
The far-reaching lawsuit filed in California alleges, “Meta has harnessed powerful and unprecedented technologies to entice, engage, and ultimately ensnare youth and teens.”
“Its motive is profit, and in seeking to maximize its financial gains, Meta has repeatedly misled the public about the substantial dangers of its social media platforms,” the complaint reads. “It has concealed the ways in which these platforms exploit and manipulate its most vulnerable consumers: teenagers and children.”
The states’ complaint alleges Meta knowingly violated the Children’s Online Privacy Protection Act by collecting data on children without informing their parents and obtaining their permission.
“Meta has been harming our children and teens, cultivating addiction to boost corporate profits,” California Attorney General Rob Bonta wrote. “With today’s lawsuit, we are drawing the line.”
Companies’ response
Although platforms such as TikTok and Snapchat may come under scrutiny in the future, for now the focus is on Meta, which owns four of the biggest social media platforms, each with more than 1 billion monthly active users: Facebook (core platform), WhatsApp, Facebook Messenger, and Instagram.
In a statement, Meta said it is committed to safe, positive experiences online, and has already introduced over 30 tools to support teens and families.
“We’re disappointed that instead of working productively with companies across the industry to create clear, age-appropriate standards for the many apps teens use, the attorneys general have chosen this path,” the company stated.
According to the complaint, Facebook CEO Mark Zuckerberg told Congress in 2021 that it was “a common misconception” that the company sought to increase the amount of time people spend on its platforms. “I don’t give our News Feed team or our Instagram team goals around increasing the amount of time that people spend,” he said.
Social media companies say they bar kids under 13 from signing up for their platforms, but many children get around the restrictions, with and without parental consent, and many younger kids have social media accounts, according to the Associated Press.
TikTok recently introduced a default 60-minute time limit for users under 18. But once the limit is reached, minors can simply enter a passcode to keep watching. Snapchat and other social platforms have also been blamed for contributing to the youth mental health crisis but are not part of this lawsuit.
Government weighs in
A growing body of literature has examined the damage that social media addiction inflicts on the mental health of youth and young adults. In May, the Surgeon General issued a new advisory about social media use and youth mental health.
According to the Surgeon General, “we don’t have enough evidence to say (social media is) safe, and in fact, there is growing evidence that social media use is associated with harm to young people’s mental health.”
Congress is taking a closer look as well. In a July New York Times op-ed, Senators Elizabeth Warren (D-MA) and Lindsey Graham (R-SC) pushed for a new federal agency to regulate Big Tech, while highlighting the harm large tech companies can do.
“Giant digital platforms have provided new avenues of proliferation for the sexual abuse and exploitation of children, human trafficking, drug trafficking and bullying and have promoted eating disorders, addictive behaviors and teen suicide,” the Senators wrote.
Warren and Graham have drafted the Digital Consumer Protection Commission Act, which would create a new federal commission to regulate digital platforms. Child advocacy groups such as Enough Is Enough, which seeks to protect children online, have come out in favor of the bipartisan legislation.
The group calls the act a “crucial piece of legislation that marks a significant step forward in our collective efforts to combat the internet exploitation of children and ensure a safer digital environment for all.”
For more information on the leading apps and how to protect your children, visit SafeOC.
Sign up for the ReadyOC newsletter and the SafeOC newsletter to receive local updates, public safety alerts, and tips.