# AI chatbots face age assurance clampdown prompted by teen suicide, psychosis
Big Tech lobby already denouncing law that would stop AI from recommending nooses
Oct 31, 2025, 12:09 pm EDT | Joel R. McConvey
Categories Age Assurance | Biometrics News
What are nightmares made of? Halloween lore would say vampires and werewolves and witches and ghosts. But it turns out the ghoulies lurking in our homes aren't movie monsters. They may, in fact, be large language models (LLMs) deployed as AI chatbots.
As generative AI tech has swept the globe, powered by relentless evangelizing from its developers and investors, so too has a new sickness descended. AI is driving people insane: this week, OpenAI disclosed that more than a million people a week display suicidal intent when conversing with ChatGPT and that hundreds of thousands have signs of psychosis. And the tally is growing of parents who place the blame for their children’s suicides on the shoulders of chatbots.
Regulators are taking note. This month, Canada’s AI minister publicly pondered age checks for AI chatbots, and U.S. states are already flexing legislation. Now, the issue has reached America’s federal government.
## Hawley bill aims to stop AI from ‘breaking’ children
U.S. legislators are moving quickly on regulations for LLM chatbots, as the tech marketed as the future of the global economy continues to show its capacity to send people to the grave.
Missouri Senator Josh Hawley introduced S.3062, "A bill to require artificial intelligence chatbots to implement age verification measures and make certain disclosures, and for other purposes," leading a group of senators who hope to curb chatbots' ability to have sexually explicit conversations with children, or to counsel them to kill themselves or others.
According to a report from Roll Call, the bipartisan bill would enshrine criminal penalties for companies allowing chatbots to engage in the prohibited conversations with kids, and