OpenAI will allow age-verified ChatGPT users to have erotic conversations starting in December, as the company relaxes restrictions it implemented to address mental health concerns.
Sam Altman, CEO of OpenAI, announced the change on X as part of the company’s “treat adult users like adults” principle, reports The Verge. The move comes as OpenAI rolls out age-gating more fully across its platform.
Altman says: “We made ChatGPT pretty restrictive to make sure we were being careful with mental health issues. We realize this made it less useful/enjoyable to many users who had no mental health problems, but given the seriousness of the issue we wanted to get this right. Now that we have been able to mitigate the serious mental health issues and have new tools, we are going to be able to safely relax the restrictions in most cases.”
Earlier this month, OpenAI hinted at allowing developers to create “mature” ChatGPT apps after implementing appropriate age verification and controls. The company is not alone in this space, as Elon Musk’s xAI previously launched flirty AI companions that appear as 3D anime models in the Grok app.
OpenAI also plans to launch a new version of ChatGPT in a few weeks that behaves more like what users appreciated about GPT-4o. The company made GPT-5 the default model powering ChatGPT but brought back GPT-4o as an option after users complained the new model was less personable.
Altman wrote that the new version will let users choose a personality that behaves more like what people liked about 4o, adding that if users want ChatGPT to respond in a very human-like way, use emoji, or act like a friend, the chatbot should do so, but only if they want it.
OpenAI has launched tools to better detect when a user is in mental distress and announced the formation of a council on wellbeing and AI to help shape the company’s response to complex or sensitive scenarios. The council comprises eight researchers and experts who study the impact of technology and AI on mental health, but it does not include any suicide prevention experts; many such experts recently called on OpenAI to roll out additional safeguards for users with suicidal thoughts.