KOSA Is Good Tech Policy, but the House Has an Opportunity to Make It Even Better

August 9, 2024

[Republished from Tech Policy Press.]

Last week, the U.S. Senate overwhelmingly passed the Kids Online Safety Act (KOSA) in a 91-3 vote. This moves the bill one step closer to becoming the first significant piece of federal legislation to address harms inflicted by online platforms on users — in this case, children. Now it’s up to the House of Representatives to pass an improved version of this historic piece of legislation, which it would need to do in the weeks between its return from recess on September 9 and the general elections on November 5 — a difficult, but not impossible, feat.

The version of KOSA that passed the Senate would regulate online platform design rather than content. The bill creates a “duty of care” for online platforms, which includes gaming sites, requiring them to take reasonable steps to prevent and mitigate specific harms to children stemming from platform use. These harms, as enumerated by the bill, include anxiety, depression, eating disorders, substance use disorders, suicidal behaviors, online bullying and harassment, the promotion of addiction-like behaviors, and sexual exploitation.

KOSA is a long-overdue piece of legislation that was thoughtfully drafted in consultation with vulnerable groups, such as the LGBTQ+ community, to avoid infringing on freedom of expression and access to information online.

But the Senate version of KOSA is not perfect, and the House has an opportunity to improve the text in at least two ways. First, lawmakers should address concerns about potential misapplication of the duty of care provision to content moderation choices, which could have the effect of silencing disfavored opinions. The bill already provides that the duty of care pertains to “the creation and implementation of any design feature,” but the text could explicitly state that content policies and individual moderation decisions are not considered design features. Second, lawmakers could hone the provision on default settings to clarify that where there is a conflict between achieving maximum privacy and maximum safety, as there sometimes is, platforms can exercise their judgment in balancing both interests.

For my full analysis of this important legislative development, please read this article on Tech Policy Press.
