
Opinion | Congress Must Pass the Kids Online Safety Act by Christmas

‘Anxious Generation’ scholars argue that KOSA will protect children and is not a threat to free speech

Published on: December 16, 2024


Imagine the following situation: There is a toy that kills dozens or hundreds of children every year and harms millions more. The toymaker makes the toy available for free to any child who can reach the Internet, so parents cannot stop their children from playing with the toy unless they tightly monitor and control their children’s access to the Internet, even at school. Because parents feel so powerless, the toy becomes their greatest fear, and the vast majority want the government to compel the toymaker to remove a few of the toy’s most dangerous features. Year after year, the government does nothing.

After more than a decade of mounting evidence of harm and rising parental concerns, Congress finally acts in a stunningly bipartisan fashion to craft a modest bill that would remove a few (just a few) of the toy’s most dangerous features, without restricting children’s continued access to the toy.


The bill is designed and modified carefully over several years and many hearings to take into account every conceivable objection from the left and from the right. It passes the Senate by a landslide vote of 91 to 3 and is then sent to the House of Representatives, where it also enjoys strong bipartisan support. And then, after all that work and all that support, the House leadership kills the bill without giving any believable justification.

This situation would be a travesty of democracy and common sense, and yet it is exactly what is happening with social media and the Kids Online Safety Act (KOSA), which now has only one week left to be enacted. Speaker Mike Johnson said that he is killing the bill because he still has free speech concerns, but as we’ll show, this objection is not grounded in reality.

Industrial-scale harm committed by Snap, ByteDance and Meta

For documentation of our claim that social media is as harmful as the hypothetical toy described above, you can see our many essays on After Babel. We’ll also add a few more facts here to establish that we are not exaggerating the harm or the need for greater protection of children and adolescents.

We know from the briefs filed by many attorneys general, which reveal internal communications from social media companies, that Snap gets 10,000 reports of sextortion per month. As reported in State of New Mexico vs. Snap Inc.:

“Snap was specifically aware, but failed to warn children and parents, of ‘rampant’ and ‘massive’ sextortion on its platform — a problem so grave that it drives children facing merciless and relentless blackmail demands or disclosure of intimate images to their families and friends to suicide. Snap trust and safety employees acknowledged the ‘psychological impact’ of sextortion on its victims ‘especially when those victims are minor.’ By November 2022, Snap employees were discussing 10,000 user reports of sextortion each month, while acknowledging that these reports ‘likely represent a small fraction of this abuse’ given the shame and other barriers to reporting.”

The shame and fear inflicted on each and every one of these young victims is a huge cost in itself, and that cost climbs far higher when we consider that some of them then end their own lives.

An FBI investigation examined 12,600 reports of online sextortion that occurred between October 2021 and March 2023, and identified 20 deaths by suicide directly linked to these cases. Those teens would still be alive if not for a product that, by design, easily connects children with adult strangers, via disappearing photos, with few safeguards. In addition, according to the 2023 Federal Human Trafficking Report, Snapchat has been identified as the leading recruitment platform for sex trafficking victims.

Snapchat has also been a crucial marketplace for illicit drugs, compounding the teen opioid crisis. Snapchat’s internal researchers found that at least 700,000 Snapchatters are exposed to drug content every single day, “in the areas that we scanned” — and that “some teens have even died as a result of buying drugs that they found through Snapchat.”

Research shows that teens are especially susceptible to compulsive use of social media, say authors Jonathan Haidt and Zach Rausch. Photo: iStock

We know from the work of those attorneys general, and from investigative journalists, that TikTok is also harming children at an industrial scale, and that its executives are well aware of it. We are fortunate that one of the briefs (Kentucky vs. TikTok) was posted to the internet with many sections redacted, but the redactions were applied improperly, so anyone could copy the hidden text out from behind the black bars.

Let’s examine a few quotations from executives at ByteDance, which owns TikTok. One executive explained that the “product in itself has baked into it compulsive use.” Another stated, “The reason kids watch TikTok is because the algo[rithm] is really good. ... But I think we need to be cognizant of what it might mean for other opportunities. And when I say other opportunities, I literally mean sleep, and eating, and moving around the room, and looking at somebody in the eyes.”

Internal documents show that they know that teens are especially susceptible to compulsive use, with one internal report stating that minor users are “particularly sensitive to reinforcement in the form of social award,” have “minimal ability to self-regulate effectively,” and “do not have executive function to control their screen time.”


Internal research from TikTok also reveals that the company has failed at preventing highly inappropriate content from reaching users. For example, the researchers found that 35.71 percent of “Normalization of Pedophilia” content, 33.33 percent of “Minor Sexual Solicitation” content, 39.13 percent of “Minor Physical Abuse” content, 30.36 percent of “Leading Minors Off Platform” content, 50 percent of “Glorification of Minor Sexual Assault” content, and 100 percent of “Fetishizing Minors” content was missed by TikTok’s content moderation process.

We know from many whistleblowers and journalists that Meta is also harming children at an industrial scale, primarily through Instagram. Arturo Bejar, a whistleblower and former Senior Engineer at Instagram, revealed internal research showing that on Instagram, about 20 percent of 13- to 15-year-olds say they were the target of bullying in the past seven days.

Opponents of KOSA have argued that the bill is designed to silence free speech and will not make the Internet safer for children. Photo: iStock

Bejar also revealed that 13 percent of 13- to 15-year-olds said that they have received unwanted sexual advances … in the past seven days. Bejar describes these findings like this: “Instagram hosts the largest-scale sexual harassment of teens to have ever happened.”

Other internal research, brought out by whistleblower Frances Haugen, found that in or around 2021, 6 percent of teen girls in the U.S. and 13 percent of teen girls in the U.K. “traced their desire to self-harm/commit suicide to Instagram.”

One in eight Instagram users told Meta’s researchers that they thought the platform made thoughts of suicide or self-injury worse. Not to mention the widely known findings that Instagram “makes body image issues worse for one in three teen girls,” and that one in five teens say that Instagram makes them feel worse about themselves.

There is an active debate among researchers about whether social media use, in general, is a cause (versus a mere correlate) of the rising levels of internalizing disorders (e.g., anxiety and depression) at the “population level” (meaning: those hockey stick graphs of rising anxiety and depression in the early 2010s that we showed in Chapter 1 of “The Anxious Generation”). But when we look at the individual level, as Surgeon General Vivek Murthy has urged, the direct harm to specific children cannot be denied. Not one of the hundreds of parents whose sons and daughters died by suicide within a few days of being sextorted has mistaken correlation for causation.

It is the heroic parents of these dead and injured children who are the driving force behind KOSA.

KOSA is not a threat to free speech

KOSA is a straightforward bill that puts in place a few protections for kids online (defined as those younger than 16). Its key features include: 1) setting the strongest privacy settings for kids by default, 2) restricting addictive product features and personalized recommendation algorithms for minors, and 3) mandating that companies remove design features for minors that are known to contribute to suicide, eating disorders, substance abuse, and sexual exploitation, or that are advertisements for certain illegal products (e.g. tobacco and alcohol).


This third protection has been most contested — opponents have raised concerns that the government will use the “duty of care” as a way to censor political or ideological content that a particular administration does not like. But this is a fallacy. KOSA does not limit free speech and it does not regulate content.

Here is the provision in the bill that specifically outlines this point:

“Nothing in subsection (a) shall be construed to require a covered platform to prevent or preclude any minor from (A) deliberately and independently searching for, or specifically requesting, content; or (B) accessing resources and information regarding the prevention or mitigation of the harms described in subsection (a).”

In other words: KOSA places absolutely no restrictions on what any person can say on a social media platform — even if they want to say horrific things that promote eating disorders or suicide. KOSA also places no restrictions on what any child can search for — even if they are interested in finding horrific stuff about eating disorders or suicide. KOSA merely says, for the first time, that the platforms bear some responsibility for the content-neutral design choices they make. There are therefore no implications for free speech.

But some opponents of regulation continued to say that KOSA could, conceivably, down the road, maybe pose a threat to free speech. That’s why Elon Musk recently got involved. Musk is among the most ardent free speech advocates in the world. He owns a platform that will be covered by KOSA. He hired a CEO (Linda Yaccarino) who had been a critic of social media, and who seems genuinely to care about children’s safety.

In late November, Yaccarino and others from X engaged in negotiations with the two Senate co-sponsors, Richard Blumenthal and Marsha Blackburn, to add language that specifically says that KOSA cannot be used to censor anyone or to expose the companies to liability for what anyone posts. The new provision states:

“Nothing in this section shall be construed to allow a government entity to enforce subsection (a) based upon the viewpoint of users expressed by or through any speech, expression, or information protected by the First Amendment to the Constitution of the United States.”


Yaccarino then posted on X her report on the negotiations, which is worth quoting in full:

“At X, protecting our children is our top priority. As I’ve always said, freedom of speech and safety can and must coexist. And as a mother, it’s personal.

When X testified before the Senate Judiciary Committee last January, we committed to working with Congress on child safety legislation. We’ve heard the pleas of parents and youth advocates who seek sensible guardrails across online platforms, and the Kids Online Safety Act (KOSA) addresses that need.

After working with the bill authors, I’m proud to share that we’ve made progress to further protect freedom of speech while maintaining safety for minors online. Thank you to @MarshaBlackburn and @SenBlumenthal for your leadership, dedication and collaboration on this issue and landmark legislation.

We urge Congress and the House to pass the Kids Online Safety Act this year.”

Musk immediately posted his approval of Yaccarino’s post, adding on, “Protecting kids should always be priority #1.”

In June of this year, Surgeon General Vivek Murthy called for social media platforms to be required to display warning labels aimed at curbing use by children, similar to those required of alcohol and tobacco companies. Photo: iStock

With Musk and Yaccarino now backing the revised version of KOSA, many other prominent Republicans, including Donald Trump Jr. and Sarah Huckabee Sanders, have been standing up for KOSA and calling for its passage. As Senators Blackburn and Blumenthal put it, “These changes should eliminate once and for all the false narrative that this bill would be weaponized by unelected bureaucrats to censor Americans.”

Speaker Johnson’s expressed concerns are not relevant to KOSA. The free speech protections are as explicit as possible, thanks to Musk and X.

We cannot know what Johnson’s real reasons are for blocking KOSA, but it may not be a coincidence that Meta, together with ByteDance, spent more than $200,000 a day in the first half of 2024 to block KOSA. Meta also recently announced that it will spend ten billion dollars to build an AI facility in Louisiana, the home state of both Speaker Johnson and House Majority Leader Steve Scalise.

Pass KOSA by Christmas

There is indisputable harm happening to children at an industrial scale — reaching literally millions of children. KOSA is a bipartisan bill that would begin to address those harms. Elon Musk and Linda Yaccarino stepped in to enshrine free speech protections in explicit language in the bill. Many tech companies now support KOSA. So did 91 Senators. So do leading Republicans and Democrats. So do most parents.

It’s time to pass KOSA. The clock is about to run out. The objections have been addressed. Pass the bill now.


Editor’s note: This op-ed is reposted with permission from After Babel. ParentMap publishes articles, op-eds and essays by people from all walks of life. The opinions expressed in their articles are their own and are not endorsed by ParentMap.
