Eight people were killed, including six children, when 18-year-old Jessie Van Rootselaar opened fire at a secondary school in Tumbler Ridge, British Columbia, in February, News.Az reports, citing foreign media.
Media reports have since revealed that Van Rootselaar’s ChatGPT activity was flagged by OpenAI’s safety team months before the attack for references to gun violence, but the company did not alert local police.
Last week, OpenAI chief executive Sam Altman apologised to families of the victims.
“I am deeply sorry that we did not alert law enforcement,” Altman wrote in an open letter published by local news outlet Tumbler RidgeLines.
“While I know words can never be enough, I believe an apology is necessary to recognize the harm and irreversible loss your community has suffered.”
In a statement responding to the lawsuits, an OpenAI spokesperson said the company has “a zero-tolerance policy for using our tools to assist in committing violence.”
The spokesperson added that OpenAI had “already strengthened our safeguards”, including better assessment and escalation of “potential threats of violence.”
The company also published a blog on Tuesday outlining how OpenAI responds to users who display potentially dangerous behaviour on ChatGPT.

The new legal actions were filed in a California court on Wednesday by a joint legal team from the US and Canada.
They will replace a previous lawsuit filed in a Canadian court by the family of one surviving victim, 12-year-old Maya Gebala, which is being voluntarily withdrawn.
Gebala remains in hospital after being shot three times, in the head, neck and cheek.
Jay Edelson, the lawyer representing the families and community members in the new lawsuits, said he expects to file more than two dozen legal actions on behalf of Tumbler Ridge victims and community members against OpenAI.
He added he will be requesting trials by jury in each case.
“We feel very comfortable making a case in front of a jury,” he told the BBC.
The lawsuits accuse OpenAI and its senior leadership, including Altman, of negligence and aiding and abetting the Tumbler Ridge mass shooting by failing to alert law enforcement of the suspect’s ChatGPT activities prior to the attack.
One lawsuit naming Gebala and her family alleges that OpenAI “had actual knowledge” of the shooter’s intention to carry out an attack through conversations with ChatGPT, where the shooter described “scenarios involving gun violence”.
The conversations were flagged by a 12-person safety team at OpenAI, who recommended that the suspect be reported to the Royal Canadian Mounted Police (RCMP), Edelson said.
Executive leadership at OpenAI, however, vetoed that decision, the lawsuit alleges.
It further alleges that OpenAI’s senior leadership made the call not to alert police in order to protect the valuation and reputation of the $850bn (£630bn) company.
“They did the math and decided that the safety of the children of Tumbler Ridge was an acceptable risk,” the lawsuit states.
It also alleges that OpenAI lied about the suspect being banned from the platform after the troubling activity was flagged, arguing that the company makes it easy for users to create new accounts.
The suspect, the lawsuit states, made another account under the same name and “continued using ChatGPT to plan the attack”.