From Los Angeles Times:
“We make body image issues worse for 1 in 3 teen girls.”
“Teens blame Instagram for increases in the rate of anxiety and depression.”
Those findings from the company’s internal research were among many alarming disclosures made public last year when former Facebook product manager Frances Haugen blew the whistle on the social media giant.
She revealed the company knew its Instagram platform was harmful to the mental health of a large share of young users, who reported feeling addicted to scrolling through images on the app even though it made them feel bad about themselves or, in some cases, promoted eating disorders and self-harm. Yet the company continued to court teenagers, viewing their engagement as key to growing its profits.
“Left alone, Facebook will continue to make choices that go against the common good, our common good,” Haugen said in testimony to Congress in October, imploring the government to regulate social media, as it has done with other consumer products — like cars and cigarettes — that pose hazards to the public.
The dangers here are the algorithms and other features designed to suck people in and keep them on a platform by showing content the user is likely to respond to — even if it’s more dangerous or extreme than what they first searched for or simply not age-appropriate. Haugen told Congress that Facebook research shows a search for healthful recipes could push a user toward posts promoting anorexia. An investigation by the Wall Street Journal found that TikTok’s algorithms serve up videos of sexual violence and drug use to accounts owned by 13-year-olds.
Elected officials, legal scholars and child welfare advocates are grappling with how to make the internet safer for kids and teens while respecting 1st Amendment rights and not hampering the benefits of technological innovation. California lawmakers have an opportunity to keep this debate alive with two bills that face votes in the Assembly this week.
Assembly Bill 2408 would hold social media companies liable for harm caused to children who become addicted to their platforms, essentially prodding the companies to eliminate addictive features from minors’ accounts and giving parents new rights to sue if they don’t. While the legislation does not list specific features, those could include auto-play features that spit out a continuous stream of videos, notifications that pop up 24 hours a day, algorithms that serve up enticing but harmful content or an endless scrolling design that’s fashioned to keep users on the site.
This bill is controversial because it creates a new avenue for lawsuits. Social media companies lobbying against the bill say it is so onerous that they would wind up booting minors from their platforms rather than risk getting sued. They argue that the bill may be found to be unconstitutional for restricting how platforms present information.
Assembly Bill 2273 is more focused on data privacy. It would require online services used by children to be designed in “age-appropriate” ways — such as banning location tracking and defaulting social media accounts to the most private settings — but does not take aim at addictive features or include an enforcement mechanism. It would establish a task force within the state’s new data privacy agency to figure out a lot of the details, such as how platforms should verify users’ age and how to communicate privacy information in kid-friendly terms.
If the bills clear the Assembly this week, they will advance to the Senate, where they can be debated and refined through the summer, giving lawmakers more time to work out the important questions the legislation raises.
Of course, regulating global platforms one state at a time is less than ideal. We would much rather see Congress take action to make the internet safer for all American children, which it can do by passing the Kids Online Safety Act. This bipartisan legislation would require social media platforms to create tools allowing parents to modify algorithms and eliminate features, such as auto-play, that extend time online. And it establishes an important obligation for social media companies to act in the best interest of minors by preventing promotion of self-harm, suicide, eating disorders, substance abuse and sexual exploitation.
But California shouldn’t wait for Washington to act. As the home of Silicon Valley, the state that has brought life-altering technologies to the world has an obligation to help remedy their pitfalls. There’s too much at stake to allow Congress to drag its feet.
Serious mental health problems among high school students skyrocketed during the same decade that teens’ use of cellphones and social media became pervasive. From 2009 to 2019, the portion of high schoolers reporting “persistent feelings of sadness or hopelessness” increased by 40%, to more than 1 in 3 students, according to the U.S. surgeon general’s advisory last year on the national crisis in youth mental health. During the same decade, the number considering suicide shot up 36% — to about 1 in 5 high schoolers.
The effect of technology “almost certainly varies from person to person,” the surgeon general’s report says, citing research that shows both negative and positive consequences of teens’ social media use. While some research shows time spent online leads to depression and anxiety, other research shows it helps people form meaningful connections with friends and family.
“Even if technology doesn’t harm young people on average, certain kinds of online activities likely do harm some young people,” the report concludes. In particular, passive social media activities — such as scrolling through posts and auto-play videos — are more strongly linked to declines in well-being than active uses such as commenting on posts or recording videos.
More recent research has shown that social media platforms, including Instagram and Snapchat, have made it easy for teenagers to find and buy deadly drugs, such as pills laced with fentanyl.
Meta, the rebranded company that owns Facebook and Instagram, says it has developed new tools to help parents supervise their kids on the platforms, such as seeing how much time they’re spending on those sites and defaulting new teen accounts to higher privacy settings. That’s good. But it should not stop lawmakers from pushing the companies to go further.
California should keep the pressure on to make it clear to Congress and the tech industry that American families want stronger protections for kids online. If the nation doesn’t act to curb dangerous social media features, the states must.
The Los Angeles Times’ editorial board determines the editorial positions of the organization. The editorial board opines on the important issues of the day – exhorting, explaining, deploring, mourning, applauding or championing, as the case may be. The board, which operates separately from the newsroom, proceeds on the presumption that serious, non-partisan, intellectually honest engagement with the world is a requirement of good citizenship. You can read more about the board’s mission and its members at the About The Times Editorial Board page.