California governor vetoes major AI safety bill

California's bill tried to put some teeth into AI safety regulations. It won't be the last try.

On Sunday, California Gov. Gavin Newsom vetoed Senate Bill 1047, a controversial set of artificial intelligence safety regulations that would have imposed several mandates on companies, objecting to the bill’s approach. As a result, the state’s many AI players, including Apple, won’t have to change how they work or face potential penalties under that particular legislation.

But despite leaving SB 1047 unsigned, Newsom said he does believe in the need for AI safety regulation.

Newsom vetoed SB 1047 Sunday by returning the bill unsigned to the California State Senate with a letter of explanation. The bill — the Safe and Secure Innovation for Frontier Artificial Intelligence Models Act — landed on Newsom’s desk after passing the Senate in late August under lead authorship from Sen. Scott Wiener (D-San Francisco). Had it become law with Newsom’s signature, SB 1047 would likely have influenced Apple Intelligence’s development and implementation. Apple’s ongoing AI push debuts with the upcoming iOS 18.1 and macOS Sequoia 15.1 releases. The new AI features require an iPhone 15 Pro or later, or iPads and Macs with the M1 chip or newer.

But even with SB 1047’s failure, the nascent AI industry knows regulation is coming. So far, however, Apple has not commented on SB 1047 or on AI safety regulation in general.

Various reasons for veto

Gov. Newsom cited various reasons for vetoing the legislation, including the burden it places on companies, as well as its broadness:

While well-intentioned, SB 1047 does not take into account whether an AI system is deployed in high-risk environments, involves critical decision-making or the use of sensitive data. Instead, the bill applies stringent standards to even the most basic functions — so long as a large system deploys it. I do not believe this is the best approach to protecting the public from real threats posed by the technology.

And he said SB 1047 could dampen innovation. “Smaller, specialized models may emerge as equally or even more dangerous than the models targeted by SB 1047 — at the potential expense of curtailing the very innovation that fuels advancement in favor of the public good,” Newsom wrote.

Still a need for regulation of AI

Newsom said he thinks guardrails need to be in place, including consequences for companies or other bad actors running afoul of future regulations. But he doesn’t think the state should “settle for a solution that is not informed by an empirical trajectory analysis of AI systems and capabilities.”

For his part, SB 1047 lead author Wiener called the veto a setback in a post on X (formerly Twitter).

“This veto leaves us with the troubling reality that companies aiming to create an extremely powerful technology face no binding restrictions from U.S. policymakers, particularly given Congress’s continuing paralysis around regulating the tech industry in any meaningful way,” he wrote.

What was in SB 1047?

Despite opposition from many in the tech industry, such as OpenAI, the bill enjoyed broad bipartisan support. It came along after the Biden administration’s AI guidelines that Apple and other tech companies pledged to follow, but the new bill contained more detail and included enforceable mandates. In other words, it had some teeth the White House guidelines lack.

SB 1047 focused on regulating sophisticated AI models, potentially affecting future AI features on Macs and other devices. It would have required AI developers to implement safety testing for advanced AI models that cost more than $100 million to develop, or that require a defined amount of computing power. Companies would also have had to show they could quickly shut down unsafe models and protect against harmful modifications.

Further, the bill gave the state attorney general the power to sue developers who don’t comply with the rules. It included protective measures for whistleblowers who point out AI dangers, and it mandated that developers hire third-party auditors to assess their safety precautions.
