AI systems need Energy Star-like rating, says policy expert

Setting guardrails, incentivizing due diligence

Not all AI algorithms and systems are harmful, which means policymakers should identify the most sensitive, high-risk areas that AI could affect and focus regulatory efforts there, Turner Lee said.

For example, guardrails should be set for AI systems within financial products that make determinations about loans or mortgages for consumers, or to ensure consumer privacy isn't violated. "Policymakers should define the guardrails," she said.

It’s also “incumbent upon developers to do their due diligence” as they create AI systems, Turner Lee said.

Data scientists should be responsible and accountable for their AI systems, which includes completing all testing on a product and embedding the right auditing tools.

“That is not the responsibility of government, that is the responsibility of the scientist or the agency or corporation licensing the autonomous system,” she said.

Policymakers have a role to play when it comes to that due diligence. Indeed, Turner Lee suggested that a rating system comparable to the Energy Star ratings given to appliances should be created for AI systems, so consumers know an AI system meets federally mandated standards, just as an Energy Star rating shows an appliance meets federally mandated guidelines for energy efficiency.

“What we can do is create transparency and we can produce a culture of excellence that allows us to have these components in place,” she said.

Turner Lee said policymakers and data scientists should work together to define the rules of the road for AI.

“You need people working together and collaborating on what those principles are,” she said. 

Makenzie Holland is a news writer covering big tech and federal regulation. Prior to joining TechTarget, she was a general reporter for the Wilmington StarNews and a crime and education reporter at the Wabash Plain Dealer.