California Gov. Gavin Newsom vetoed a bill on Sunday that sought to ward off catastrophic risks of highly advanced artificial intelligence models.
The bill, SB 1047, was the most controversial AI bill of the legislative session, with dozens of activists and AI companies lining up on either side. The authors warned that if left unregulated, AI models could be used to develop chemical or nuclear weapons, which could lead to mass casualties. Opponents argued that heavy-handed regulation would strangle the development of AI and force AI companies to leave the state.
In a veto message, Newsom said the bill addresses a genuine problem but does not establish the right regulatory framework.
“I do not believe this is the best approach to protecting the public from real threats posed by the technology,” he wrote. “Ultimately, any framework for effectively regulating AI needs to keep pace with the technology itself.”
Among the supporters of the bill was SAG-AFTRA, the union that represents Hollywood actors. A group called “Artists for Safe AI” also issued an open letter supporting the bill on Tuesday, with support from J.J. Abrams, Shonda Rhimes, Judd Apatow, Rob Reiner, Jane Fonda, Rian Johnson, Adam McKay, Mark Hamill, Mark Ruffalo, Don Cheadle and others.
The actors union has been outspoken on the threat of AI to clone actors and put them in movies or TV shows without their consent. This is the first time the union has weighed in on the bigger-picture risks, outside the entertainment realm, that could come from highly advanced AI models.
“It really stems from the fact we have experienced firsthand the dangers of one aspect of AI,” said Jeffrey Bennett, general counsel of the union. “This bill seems to be the one bill that targets only the incredibly powerful expensive systems that have the capability to cause a mass critical problem. Why not regulate at that level? Why not build in some sensible, basic safety protocols at this stage of the game?”
While announcing the veto on Sunday, Newsom said he would also convene experts to develop regulations to promote the safe development of AI, and that he would continue to work on the issue next year.
SAG-AFTRA supported two other AI bills in California this year, which regulate the use of AI in the entertainment context. Newsom signed both bills at SAG-AFTRA headquarters earlier this month.
The union did not get involved in the debate over SB 1047 while it was pending in the Legislature. But on Sept. 9, the union sent a letter to Newsom urging him to sign the bill.
“AI assisted deepfake technology has been utilized to create fake nude and pornographic images of SAG-AFTRA members,” wrote Shane Gusman, the union’s lobbyist in Sacramento. “In our view, policy makers have a responsibility to step in and protect our members and the public. SB 1047 is a measured first step to get us there.”
Other Hollywood unions and companies did not weigh in on the issue, which addressed only “frontier” AI models that do not currently exist.
Newsom signed another AI bill, AB 2013, which requires AI developers to disclose whether they are training their models on copyrighted work.
The Concept Art Association was a key supporter of that bill. The group represents artists who create the visual ideas for movies, animation and video games. In recent years, many have seen their work scooped into AI models, which can be used to supplant their jobs.
The bill does not require developers to reveal the entire data set that was used to train their models. Nor does it require developers to pay for using copyrighted works — a highly contested issue that is still being litigated. It simply requires developers to acknowledge the use of copyrighted data, or other “personal” information.
“Any disclosure we can get is a good thing,” said Deana Igelsrud, legislative and policy advocate for the Concept Art Association. “It’s very general, but it’s a start.”
The Hollywood unions have backed a similar bill in Congress, which was proposed by Rep. Adam Schiff earlier this year.
“None of these AI systems would be able to output anything if they weren’t filled with all the art in the history of the world,” Igelsrud said. “I don’t think people really understand that real human beings are attached to the data. Everyone just assumes if you put it on the internet, it’s a free for all. It’s not.”