Young users’ reliance on ChatGPT raises questions over AI advice and autonomy
The growing reliance on AI for personal guidance raises questions about privacy, accuracy and the role of human judgment in decision-making.
Sam Altman has described a generational divide in how people use ChatGPT, saying younger users are integrating the tool more deeply into learning, planning and everyday decision-making.
Speaking at Sequoia Capital’s AI Ascent 2025, the OpenAI CEO said older users tend to treat ChatGPT more like a search tool, while people in their 20s and 30s often use it as a personal advisor. College students, he said, are going further by treating ChatGPT almost like an operating system, connecting it to files, tasks and complex workflows.
The remarks point to a shift in how AI tools are being embedded into daily routines, particularly among students and younger adults. Business Insider reported that a February 2025 OpenAI report found US college students were among the platform’s most frequent users, while a Pew Research Center survey found that 26% of US teens aged 13 to 17 used ChatGPT for schoolwork in 2024, double the share recorded in 2023.
Altman’s comments also raise questions about dependence, accuracy and boundaries as AI systems move closer to advisory roles. While users may benefit from private spaces to test ideas, organise tasks and prepare decisions, concerns persist about over-reliance, data privacy and the shifting role of human relationships in decision-making.
Why does it matter?
The trend suggests that AI is becoming more than an information tool for younger users. As ChatGPT and similar systems become part of studying, planning and personal decision-making, they influence not only how information is consumed, but also how habits, confidence and judgement develop.
