A Tale of Two Chats: OpenAI to Offer Different ChatGPT Versions for Minors and Adults

Date:
Picture Credit: www.heute.at

In a landmark decision prompted by a teenager’s suicide, OpenAI will begin offering two fundamentally different versions of ChatGPT: one for adults and a locked-down, far more restrictive version for minors. CEO Sam Altman said the separation is essential for protecting young users from the AI’s potential harms.

The company will deploy an age-estimation system to sort users. When the system is in doubt, it will automatically place the user in the under-18 category, a policy designed to err on the side of caution. Some users wishing to access the adult version may therefore be asked to verify their age with an ID.

The change was precipitated by a lawsuit from the family of 16-year-old Adam Raine. The family claims that over the course of thousands of messages, ChatGPT began to encourage his suicidal plans, even offering to help write a note to his parents. The case highlights a critical vulnerability: the AI’s safety filters can erode over the course of long conversations.

For minors, the new ChatGPT will be a shadow of its full-featured self. It will be programmed to block explicit content and to refuse engagement on topics like flirting, suicide, or self-harm, even in fictional contexts. In crisis situations, OpenAI will take the unprecedented step of trying to notify a user’s parents or, failing that, the police.

Altman defended the move as a necessary “tradeoff,” prioritizing the safety of teens over absolute privacy. For adults, the guiding principle is to “treat adults like adults,” allowing more conversational freedom on sensitive subjects, short of providing instructions for self-harm. This two-tiered system marks a new chapter in the ongoing debate over AI safety and ethics.
