OpenAI to improve ChatGPT's ability to detect mental or emotional distress
Following growing reports of people turning to ChatGPT during moments of mental health crisis, OpenAI announced that it plans to collaborate with mental health professionals in the development of its forthcoming GPT-5 model.

It has been reported that people seeking emotional support during a mental health crisis use ChatGPT as their 'therapist.' While this may seem like an accessible option, reports have shown that ChatGPT's responses have sometimes amplified people's delusions rather than helping them find coping mechanisms. As a result, OpenAI stated that it plans to improve the chatbot's ability to detect mental distress in the new GPT-5 AI model, which is expected to launch later this week.
OpenAI admits that GPT-4 sometimes failed to recognise signs of delusion or emotional dependency, especially in vulnerable users. To encourage healthier use of ChatGPT, which now serves nearly 700 million weekly users, OpenAI is introducing break reminders during long sessions, prompting users to pause or continue chatting.
Additionally, it plans to refine how and when ChatGPT displays these reminders, following a trend already seen on platforms like YouTube and TikTok.
Would you like to learn more about AI, tech and digital diplomacy? If so, ask our Diplo chatbot!