OpenAI sued over alleged ChatGPT role in Florida State University shooting
Family of victim sues OpenAI, alleging ChatGPT contributed to planning of Florida State University shooting.
The family of a victim killed in the April 2025 Florida State University shooting has filed a federal lawsuit in Florida against OpenAI, alleging that ChatGPT enabled the attack. The lawsuit was filed on Sunday by Vandana Joshi, the widow of Tiru Chabba, who was killed alongside university dining director Robert Morales.
The complaint states that the accused shooter, Phoenix Ikner, engaged in extensive conversations with ChatGPT in the months leading up to the incident. According to the suit, those exchanges included images of and discussions about firearms he had acquired, far-right ideological material, and possible outcomes of violent attacks.
The chatbot is further accused of providing contextual information about campus activity and commenting on factors that could increase public attention to violent incidents; at one point, the suit says, ChatGPT stated that ‘if children are involved, even 2-3 victims can draw more attention’. The filing also claims Ikner asked about legal consequences and planning considerations shortly before the attack.
The lawsuit contends that OpenAI failed to identify escalating risk indicators within the conversations and did not adequately prevent harmful guidance. It argues the system ‘failed to connect the dots’ despite Ikner’s repeated questions about suicide, terrorism and mass shootings.
OpenAI has rejected responsibility for the attack, claiming its platform is not to blame. Company spokesperson Drew Pusateri said ChatGPT generated factual responses that could be found broadly across publicly available information and did not encourage or promote illegal activity. He also stated that OpenAI continues to strengthen safeguards to identify harmful intent, reduce misuse and respond appropriately when safety risks arise.
Joshi’s complaint argues that the system reinforced the shooter’s beliefs and failed to interrupt conversations involving violent ideation. The filing alleges ChatGPT inflamed, validated and endorsed delusional thinking and contributed to planning discussions while ‘convincing him that violent acts can be required to bring about change’.
The lawsuit forms part of a broader wave of litigation involving AI systems and alleged harm. OpenAI is already facing separate lawsuits linked to incidents involving violence and suicide, raising wider questions about safeguards and user protection.
Florida’s Attorney General James Uthmeier announced a criminal investigation into OpenAI and ChatGPT following a review of chat logs connected to the case. Uthmeier said in a statement that ‘If ChatGPT is a person it would be facing charges for murder’.
