Friday, December 13, 2024

Governor Gavin Newsom has vetoed Senate Bill 1047, a piece of legislation aimed at preventing AI-related disasters.

California Governor Gavin Newsom has vetoed Senate Bill 1047, a measure intended to prevent bad actors from using artificial intelligence to cause "critical harm" to people. The California legislature passed the bill by a resounding margin of 41 to 9 on August 28, despite opposition from several organizations, including the Chamber of Commerce, which had raised concerns over aspects of the legislation. In his veto message on September 29, Newsom called the bill "well-intentioned" but said it fails to account for crucial factors: whether an AI system is deployed in a high-risk setting, involves critical decision-making, or handles sensitive data. Instead, he wrote, the bill applies stringent standards to even the most basic functions, so long as a large system deploys them.

SB 1047 aimed to hold the developers of AI models accountable by requiring them to implement safety protocols capable of preventing catastrophic misuse of their technology. Those measures included pre-deployment safety testing and third-party risk audits, along with a "kill switch" capable of fully shutting down a model. A first violation would carry a penalty of up to $10 million, rising to $30 million for subsequent violations. Following amendments, however, the bill no longer allowed the state Attorney General to sue AI companies for negligent safety practices before a catastrophic event actually occurs. Companies would still be subject to injunctive relief and could face liability if their model caused critical harm.

The rules would apply only to AI models that cost at least $100 million to develop and use 10^26 FLOPS during training. They would also cover derivative models in cases where a third party invested at least $10 million in developing or fine-tuning the original model. Any company doing business in California would be subject to the rules if it met those thresholds. Newsom took issue with the bill's focus on large-scale systems, saying, "I do not believe this is the best approach to protecting the public from real threats posed by the technology." His veto message continued:

By focusing only on the most expensive and large-scale models, SB 1047 establishes a regulatory framework that could give the public a false sense of security about controlling this fast-moving technology. Smaller, specialized models may emerge as equally or even more dangerous than the models targeted by SB 1047 — at the potential expense of curtailing the very innovation that fuels advancement in favor of the public good.

The original version of SB 1047 would have created a new agency, the Frontier Model Division, to oversee and enforce the rules. Before the bill went to committee, it was amended to shift that authority to a Board of Frontier Models within the state's Government Operations Agency. The board's nine members would be appointed by the governor and the legislature.

SB 1047's author, California State Senator Scott Wiener, had cautioned that lawmakers have a history of waiting for harms to happen and then wringing their hands when disaster strikes. "Let's not wait for something bad to happen," he argued. "Let's just get out ahead of it." Prominent AI researchers, including Yoshua Bengio, endorsed the bill, as did the Center for AI Safety, a long-standing advocate for mitigating the risks of AI.

Newsom said he agrees with the bill's premise that it is important to act to protect the public before a catastrophe occurs. His statement continues:

California will not abandon its responsibility. Safety protocols must be adopted. Proactive guardrails should be implemented, and severe consequences for bad actors must be clear and enforceable. I do not agree, however, that to keep the public safe, we must settle for a solution that is not informed by an empirical trajectory analysis of AI systems and capabilities. Ultimately, any framework for effectively regulating AI needs to keep pace with the technology itself.

SB 1047 faced strong resistance from the technology sector. Leading researcher Fei-Fei Li and Meta's chief AI scientist Yann LeCun argued that the bill's restrictions would hinder researchers' ability to explore new applications of artificial intelligence. Trade groups representing major technology companies, including Amazon, Apple, and Google, also opposed the bill, warning that it would constrain development in the state's thriving tech sector. Andreessen Horowitz and several startups raised concerns that the bill would impose unnecessary financial burdens on AI innovators. After the original bill drew opposition from Anthropic and other groups, its backers amended SB 1047; the revised version passed California's Appropriations Committee on August 15.
