Character.AI, the AI role-playing startup known for its conversational chatbots, announced it will end open-ended chat access for users under 18 following mounting safety concerns. CEO Karandeep Anand said the change aims to reduce risks associated with prolonged AI conversations, which have been linked to mental health issues among teens.
The company plans to phase out teen chatbot access by November 25, beginning with a two-hour daily limit that will shrink to zero as the deadline approaches. Character.AI will rely on in-house and third-party age verification systems, including facial recognition and ID checks, to enforce the restriction.
Instead of offering AI companions, Character.AI is repositioning itself as a creative platform where younger users can collaborate on stories and visual content. The move follows earlier safety updates such as parental insights, content filters, and time-spent notifications — changes that have already shrunk the company's teen user base but align with its shift from conversation to creation.