The attorneys general of California and Delaware on Friday warned OpenAI they have “serious concerns” about the safety of its flagship chatbot, ChatGPT, especially for children and teens.
The two state officials, who have unique powers to regulate nonprofits such as OpenAI, delivered the warning in a letter sent to the company after a meeting with its legal team earlier this week in Wilmington, Delaware.
California AG Rob Bonta and Delaware AG Kathleen Jennings have spent months reviewing OpenAI’s plans to restructure its business, with an eye on “ensuring rigorous and robust oversight of OpenAI’s safety mission.”
But they said they were concerned by “deeply troubling reports of dangerous interactions between” chatbots and their users, including the “heartbreaking death by suicide of one young Californian after he had prolonged interactions with an OpenAI chatbot, as well as a similarly disturbing murder-suicide in Connecticut. Whatever safeguards were in place did not work.”
The parents of the 16-year-old California boy, who died in April, sued OpenAI and its CEO, Sam Altman, last month.
OpenAI didn’t immediately respond to a request for comment on Friday.
Founded as a nonprofit with a safety-focused mission to build better-than-human artificial intelligence, OpenAI had recently sought to shift more control from its nonprofit to its for-profit arm, but dropped those plans in May after discussions with the offices of Bonta and Jennings as well as with nonprofit groups.
The two elected officials, both Democrats, have oversight of any such changes because OpenAI is incorporated in Delaware and operates out of its headquarters in San Francisco, California.
Since dropping its initial plans, OpenAI has been seeking the officials’ approval for a “recapitalization,” in which the nonprofit’s existing for-profit arm would convert into a public benefit corporation required to consider the interests of both shareholders and the mission.
Bonta and Jennings wrote Friday of their “shared view” that OpenAI and the industry need better safety measures.