AI Bills Aim Well But May Lead To Burdensome Results

Lawyer Dina Blikshteyn warns that California’s AI bills, though well-intentioned, may burden small developers, even as larger firms enhance model capabilities.

Artificial intelligence (AI) models are advancing remarkably quickly.

Large-scale developers are making great progress toward improving these models’ comprehension of intricate queries and their capacity to provide more perceptive, well-reasoned answers.

This was underscored in a Sept. 12 announcement from OpenAI, the company behind the well-known ChatGPT, about its new “Strawberry” model.

The innovation, dubbed the OpenAI o1 model series, is designed to think through problems more thoroughly before responding, “much like a person would.”

The company says the models will also be able to “refine their thinking process, try different strategies, and recognize their mistakes.”

Even though AI isn’t taking over the world, and that isn’t the goal of those developing the technology, legislators are concerned about being able to rein in AI models if they go awry and want safety precautions built in during the development stage.

Bills On The Table

Over the past week, California legislators have passed a series of AI bills that affect both developers and residents of the state.

Assembly Bill 1836, which safeguards performers’ rights and likenesses, prohibits the use of AI to create digital replicas of deceased performers without prior consent.

The key bill industry insiders are debating, however, is Senate Bill (SB) 1047, the “Safe and Secure Innovation for Frontier Artificial Intelligence Models Act.”

If the measure is approved, it will mostly affect large AI developers, such as OpenAI, Google, and Microsoft, which have the financial means to build AI models that cost more than $100 million to train and require more than 10^26 integer or floating-point operations (FLOPs).

Developers will need to train and fine-tune their models to implement the safety features the bill requires.

These include the ability to shut down AI models, a written and maintained safety protocol, yearly third-party audits, and incident reports and compliance statements submitted to California’s attorney general.

Developers of all sizes across the industry oppose the bill, arguing that it stifles innovation. To find out how, Cointelegraph spoke with Dina Blikshteyn, a partner at law firm Haynes Boone.

Impact on Developers

According to Blikshteyn, the bill could also apply to smaller developers that fine-tune AI models using computing power greater than or equal to 3 × 10^25 integer or floating-point operations (FLOPs) at a cost of $10 million or more.
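To make those figures concrete, below is a minimal sketch in Python of how a developer might check the two compute-and-cost thresholds described above. The numbers come from this article; the function names, and the exact way the compute and cost tests combine, are illustrative assumptions rather than the bill’s legal text.

```python
# Minimal illustrative sketch (not legal advice): rough checks of the SB-1047
# thresholds quoted in this article. Names and the way the tests combine are
# assumptions for illustration only.

TRAIN_FLOP_THRESHOLD = 1e26           # more than 10^26 operations
TRAIN_COST_THRESHOLD = 100_000_000    # more than $100 million to train
FINETUNE_FLOP_THRESHOLD = 3e25        # at least 3 x 10^25 operations
FINETUNE_COST_THRESHOLD = 10_000_000  # roughly $10 million spent fine-tuning


def covered_training_run(flops: float, cost_usd: float) -> bool:
    """Large developers: training compute and cost both exceed the top thresholds."""
    return flops > TRAIN_FLOP_THRESHOLD and cost_usd > TRAIN_COST_THRESHOLD


def covered_fine_tune(flops: float, cost_usd: float) -> bool:
    """Smaller developers: fine-tuning compute and cost both reach the lower thresholds."""
    return flops >= FINETUNE_FLOP_THRESHOLD and cost_usd >= FINETUNE_COST_THRESHOLD


if __name__ == "__main__":
    # Hypothetical examples.
    print(covered_fine_tune(flops=4e25, cost_usd=12_000_000))      # True
    print(covered_training_run(flops=5e25, cost_usd=150_000_000))  # False: compute below 10^26
```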

“By implementing shutdown capabilities, AI bills aim to prevent disasters caused by AI models,” she said.

“However, it may not fully eliminate risks, as an AI model could trigger a chain reaction with harmful consequences even after shutdown.”

She added:

“While the bill’s intent is positive, the requirements for safety protocols, audits, and compliance reports might be seen as excessive, potentially imposing burdensome disclosure and bureaucratic demands that could hinder innovation in California’s AI industry.”

There is currently no federal framework in the United States for regulating AI model outputs. Blikshteyn points out that states such as Colorado and California are passing their own laws.

The bill on Governor Gavin Newsom’s desk would impose its requirements on Californians who train and access the covered AI models.

She noted, “The larger AI companies would have more manpower to handle the bill’s requirements, which could be considered a drain on the resources of the smaller company.”

“While large AI companies are unlikely to leave California, the variation in state laws and lack of federal oversight could push smaller developers to relocate or conduct AI work in states with fewer regulations on AI governance.”

California Leads Legislation

Blikshteyn, however, emphasizes what many in the industry believe to be true: “Federal legislation that establishes fundamental standards for potent AI models would be advantageous for both users and creators.”

It would also, she added, give all states a starting point for determining what those standards are.

Governor Newsom received SB-1047 on September 9; a decision is still pending. Speaking on the measure, Newsom stated that he has been developing “rational regulation that supports risk-taking, but not recklessness.”

At the same time, he has voiced concerns about the bill’s possible effects on competition.

Because California is a global leader in digital innovation, its legislative choices on AI are being watched closely around the world.

Bunmi Esther
