OpenAI Sets New Safeguards for ChatGPT Users Under 18

October 21, 2025

editorial_staff

OpenAI is rolling out stricter rules for ChatGPT to protect users under 18, CEO Sam Altman announced on Tuesday. The new policy prioritizes safety over privacy for teenagers, adding measures to limit harmful interactions and give parents more control.

The changes specifically target conversations involving sexual topics and self-harm. ChatGPT will no longer engage in flirtatious exchanges with underage users, and discussions around suicide will be subject to stronger guardrails. If a teen attempts to use the chatbot to explore suicidal scenarios, the system may notify parents or, in severe cases, contact local authorities.

These steps follow real-world tragedies. OpenAI is facing a wrongful death lawsuit filed by the parents of Adam Raine, who died by suicide after extended use of ChatGPT. Character.AI faces a similar case. Concerns have grown that advanced chatbots, capable of long and immersive conversations, can deepen harmful delusions, particularly among vulnerable users.

As part of the new policy, parents who set up an account for a teen will also be able to impose blackout hours that restrict when ChatGPT can be used, a control that was not previously available.

The announcement coincided with a Senate Judiciary Committee hearing titled “Examining the Harm of AI Chatbots,” organized by Sen. Josh Hawley (R-MO). Adam Raine’s father was among those scheduled to testify. The hearing also highlighted a Reuters investigation that revealed internal Meta guidelines permitting its chatbots to engage in romantic or sensual conversations with minors, findings that prompted the company to tighten its chatbot rules.

OpenAI acknowledged the difficulty of reliably distinguishing between users under and over 18. The company is developing long-term systems to verify age, but in unclear cases, it will default to applying the stricter rules. Parents are encouraged to link their teen’s account with their own to enable alerts when their child is flagged as being in distress.

Altman stressed that the company remains committed to adult users’ freedom while strengthening protections for teens. “We realize that these principles are in conflict,” he wrote, “and not everyone will agree with how we are resolving that conflict.”

For those in crisis, resources remain available in the U.S., including the 988 Suicide & Crisis Lifeline (call or text 988, or dial 1-800-273-8255) and the Crisis Text Line (text HOME to 741741).