SLO County assemblyman’s bill would let parents sue addictive social media platforms
A San Luis Obispo County legislator is taking on major social media platforms like TikTok and Instagram, saying their targeted algorithms are harming kids.
And he wants to give parents the power to sue them when it happens.
California State Assemblyman Jordan Cunningham, R-Templeton, recently coauthored Assembly Bill 2408, which introduces consequences for the addictive algorithms used by social media companies that impact the mental health of children and teens.
The bill was introduced on Feb. 17.
The bill is a bipartisan effort by Cunningham and Assembly member Buffy Wicks, D-Berkeley, and, if passed, could open the floodgates for litigation by families and individuals suffering from mental health impacts due to social media.
“Who’s going to pay to remediate that harm that’s been done to our kids?” Cunningham said in an interview with The Tribune on Friday. “Is it going to be the parents and the kids and health insurance? And the schools? Or should some of those social costs be borne by the companies that created the products in the first place?”
Assembly Bill 2408 is different from past attempts to regulate Big Tech companies because it focuses narrowly on social media platforms like TikTok and Instagram that use targeted algorithms to attract young users, Cunningham told The Tribune.
Cunningham said whistleblowers have shown that social media companies know young users are negatively impacted by the addictive algorithms their products use to keep them coming back.
Former Facebook data scientist Frances Haugen testified in front of a Senate subcommittee in October 2021 about how social media companies knew their products were harming children.
Haugen shared internal documents showing teenagers in the United Kingdom and United States who used Meta products — namely Instagram — reported depression, negative body image and suicidal thoughts after using the products, according to reporting by The Wall Street Journal. Meta is the new name for Facebook’s parent company.
Cunningham said this evidence was particularly compelling and created a sense of urgency around regulating how social media platforms impact young users.
“I think it confirmed what a lot of us had suspected for a long time, which is that some of these large social media platforms are intentionally designing their algorithms to foster addictive behavior, and getting kids hooked on their product,” Cunningham said. “And in certain cases, that’s causing significant harm.”
Social media algorithms target teens and children
Cunningham, who is a father of four — three of them teenagers — said the issue is personal for him and his constituents.
He said he’s heard from many local parents that their teenagers are spending more time on platforms like TikTok and Instagram.
That’s worrisome given those platforms’ targeting techniques — Cunningham gives the example of a teenage girl receiving a targeted ad for diet pills next to a photo of an exceptionally thin model.
The health costs associated with harmful social media algorithms targeting teens can compound over time, he said.
“You’ve gotten young people addicted to your products. You’ve made billions of dollars doing that and there’s billions of dollars of health costs on the other side of the ledger,” he said.
Cunningham compared the algorithms used by social media companies to the marketing tactics of Big Tobacco to get teenagers to smoke cigarettes.
“This is not much different than imposing liability on tobacco companies for marketing cigarettes to kids, which they did for decades until we passed the laws and said, ‘Hey, you got to pay the cost of that,’” he said.
In Facebook’s internal teen mental health study, first reported by The Wall Street Journal, researchers noted that teenagers use the language of addiction when describing their use of Instagram.
“They have an addict’s narrative about their use — it can make them feel good, feel bad,” read the documents. “They wish they could spend less time caring about it, but they can’t help themselves.”
Cunningham said he felt the increase in social media usage among teenagers may not have caused the mental health crisis that age group is experiencing, but it is certainly related to it.
The impacts are also felt locally, where psychiatric treatment options for teenagers come with a long wait list.
Cunningham said a local doctor told him it can take between three and four months for a San Luis Obispo County teenager to secure an appointment with a local psychiatrist.
Teenagers with severe mental illness need to leave the county in order to find inpatient mental health services.
Assemblyman prepared for fight over new bill
Assembly Bill 2408 isn’t the first time Cunningham and other legislators have tried to take aim at Big Tech, but it is different from other attempts at regulation because it is narrowly focused and creates actual penalties and damages, Cunningham said.
For example, the California Consumer Privacy Act, which was introduced in 2018, takes a broad sweep at companies that collect consumer data, including companies like Google and Amazon.
AB 2408 doesn’t address consumer data or apply to companies like Google and Amazon.
“It would be a new duty and a new cause of action,” Cunningham said. “It would allow the attorney general of the state to enforce it. It would allow parents to sue on behalf of their children.”
If the bill does get signed into law by Gov. Gavin Newsom, Cunningham said he anticipates many large class-action lawsuits will be introduced by parents on behalf of their children.
Cunningham said he hopes the threat of litigation would serve as a deterrent for companies building addictive algorithms into products marketed toward children.
He suspects TikTok and Meta will lobby heavily to try to prevent the bill from passing, but he is prepared for the long haul.
“I suspect it will be a fight, but I think it’s a righteous cause,” he said. “Somebody’s got to do this.”
Push for age-appropriate social media design
While AB 2408 tries to remediate the damage caused by social media platforms, Cunningham’s partner across the aisle, Wicks, is authoring a companion bill, AB 2273, which tries to pave a way forward for age-appropriate design in social media products.
Implementing age-appropriate design standards in California shouldn’t be a heavy lift for these companies, Cunningham said.
Those companies already implemented such design standards in their products after a similar regulation was passed in the United Kingdom, he said.
“The bottom line is, if you’re going to create a product that children use, and you know children are using your product, you should want to design it in a way that doesn’t result in kids seeking psychiatric help,” Cunningham said.
“And the companies that profited off of this and caused harm to those kids should cover some of the social costs of that decision,” he added.
What’s next for social media bill?
Cunningham anticipates the bill will be reviewed by various committees before it reaches the Assembly floor.
If it receives a majority vote, it will move on to the California State Senate, where it will go through the same process.
“I think if we can get the bills to the governor and get them signed, I think we’ll be in a much better place as a society,” he said.