
California lawmakers target AI ‘companionship’ chatbots

Last year, Megan Garcia filed a lawsuit against the artificial intelligence company Character.ai, alleging that her son’s AI girlfriend contributed to his death by suicide.

Now she’s pushing for legislation that can safeguard children from so-called AI “companionship” chatbots, which can act as friends, lovers and therapists.

Character.ai’s users, many of them children, chat to bots based on both fictional characters and real people.

The platform says it receives 20,000 queries per second – about a fifth of the volume of queries that Google search handles per second.

Garcia met with California senator Steve Padilla on Tuesday, telling a press conference the bots are “inherently dangerous” and can encourage inappropriate sexual conversation or self-harm.

They can also be highly addictive.

To address the problem, Padilla has proposed a bill that would force chatbot platforms to introduce a number of safety measures, including protocols for responding to suicidal ideation.

It would join a similar California bill that aims to ban AI companions for under-16s, and another in New York.

Even if Padilla’s bill is passed, the legislation may still not go far enough.

Further listening: My AI girlfriend: a cure for loneliness


Copyright © 2025 Tortoise Media

All Rights Reserved