FTC pursues AI regulation, bans biased algorithms

The FCRA

Chief among the federal laws that the FTC has designated as applying to AI algorithms is the Fair Credit Reporting Act (FCRA).

Established in 1970, the FCRA protects information collected by consumer reporting agencies. The law bans companies from giving information about consumers to anyone without a legitimate reason to have it. It also requires credit, insurance and employment agencies to notify consumers when an adverse action is taken against them, such as rejecting a loan application.

More than 50 years old, the law does not directly address AI.

“When it was written in 1970, people weren’t contemplating AI,” said Peter Schildkraut, a lawyer specializing in technology at the Arnold & Porter law firm.

Only in recent years has the FTC begun to apply the law’s language and enforcement tools to AI and to companies that make extensive use of data.

In an April 2020 guidance blog post, “Using Artificial Intelligence and Algorithms,” the FTC warned businesses that use AI that the agency can invoke the FCRA to prevent the misuse of data and algorithms in decisions about consumers.

In the blog, Andrew Smith, director of the FTC’s Bureau of Consumer Protection, wrote that “the use of AI tools should be transparent, explainable, fair and empirically sound.” Smith also advised organizations to validate their algorithms to ensure they are unbiased and to disclose key factors their algorithms use to assign risk scores to consumers, among other things.
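Smith’s call to validate algorithms is not tied to any specific test, but one common first-pass heuristic for the kind of bias check he describes is the “four-fifths rule” disparate-impact ratio: compare each group’s rate of favorable outcomes with the most favored group’s rate and flag ratios below 0.8. The sketch below is a minimal illustration of that idea only; the column names, sample data and 0.8 threshold are hypothetical, not anything prescribed by the FTC.

```python
# A minimal sketch of a disparate-impact check (the "four-fifths rule"),
# one common heuristic for auditing algorithmic decisions for bias.
# Column names, sample data and the 0.8 threshold are assumptions for
# illustration, not an FTC-mandated test.
import pandas as pd

def disparate_impact_ratios(df: pd.DataFrame, group_col: str, outcome_col: str) -> dict:
    """Return each group's favorable-outcome rate divided by the highest group's rate."""
    rates = df.groupby(group_col)[outcome_col].mean()  # share of favorable outcomes per group
    return (rates / rates.max()).to_dict()

# Hypothetical loan decisions (1 = approved) broken out by a protected attribute.
applications = pd.DataFrame({
    "group":    ["A", "A", "A", "A", "B", "B", "B", "B"],
    "approved": [1,   1,   1,   0,   1,   0,   0,   0],
})

for group, ratio in disparate_impact_ratios(applications, "group", "approved").items():
    flag = "review for possible disparate impact" if ratio < 0.8 else "ok"
    print(f"group {group}: ratio {ratio:.2f} -> {flag}")
```

In this toy data, group B’s approval rate is one third of group A’s, so the check would flag the model for further review; a real validation effort would go well beyond a single ratio.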

The FTC guidance also noted that the agency enforces the Equal Credit Opportunity Act (ECOA), which prohibits credit discrimination based on race, color, religion, national origin, sex, marital status, age, or receipt of public assistance, as well as the employment provisions of the Civil Rights Act of 1964. Smith’s guidance extended the FTC’s enforcement authority to AI-based discrimination against these “protected classes.”

The guidance also specified that companies that use data to make lending decisions, or that furnish information to agencies making decisions about consumer credit, employment, housing and government benefits, must comply with the FCRA, even if they do not realize or believe the law applies to them.

The FTC did not respond to a request for information about cases it has acted on involving misuse of AI algorithms.

However, in a 2018 action against RealPage, a real estate software company whose tenant-screening tools match housing applicants to criminal records, among other data, the FTC claimed RealPage did not take reasonable steps to ensure the accuracy of the data it provided to landlords and property managers.

The FTC said that from 2012 to 2017, RealPage’s software wrongly matched some applicants to criminal records that did not belong to them, and as a result some applicants were denied housing and other opportunities. The company paid the federal agency $3 million to settle the charges.