
AI use in housing is booming. The rules to keep it fair are shrinking.

USVInews.com User Network Contributor

The Trump administration is rolling back civil rights protections that have been used to challenge bias in housing.

President Donald Trump directed federal regulators to roll back anti-discrimination protections that civil rights advocates say are essential to keeping emerging AI technologies fair. | Alex Brandon/AP

The housing industry is rapidly adopting artificial intelligence tools to decide who gets a home loan or lease. The Trump administration is rolling back long-established protections used to keep those evaluations fair.

Technology that uses algorithms to predict outcomes — such as a home’s selling price or how likely someone is to afford their rent — isn’t new to the housing and lending industries. But significant improvements in user-friendly AI have made these tools more accessible to mortgage and real estate businesses, spurring increased interest in expanding the role of computerized systems in housing.

The widespread adoption of AI has renewed some hope that the technology can be more objective, reduce discriminatory bias and reverse entrenched inequalities.

But because AI models are trained with data that reflect historic patterns of discrimination, some are warning that these new systems could have the opposite effect. That fear has intensified since the executive branch began narrowing federal anti-discrimination enforcement.

“Artificial intelligence might advance civil rights if it’s used properly… but it might also reinforce discrimination in our society if we’re not careful, because AI is ingesting everything out there in the world,” Federal Reserve Governor Michael Barr, an outspoken critic of the administration’s deregulatory agenda, said at a recent fair housing advocacy event. “There’s a lot of things out there in the world that are deeply, deeply discriminatory.”

The technology developers spearheading efforts to apply AI to the housing sector say they train their systems to avoid accidental bias. But industry observers like Barr are worried that diminished government oversight will weaken incentives to continue taking those steps seriously.

Since taking office, the Trump administration has sought to prevent the federal government from enforcing rules based on “disparate impact” — a method for determining whether a practice amounts to illegal discrimination by looking at its effects on groups of people, regardless of intention. Because disparate impact methods focus on provable outcomes instead of intent, they’ve been used to challenge decisions influenced by algorithmic technology.

In 2024, a federal court approved a more than $2 million payout to rental applicants who said they were denied housing due to an algorithm that disadvantaged Black and Hispanic people. The plaintiffs used disparate impact standards to argue that the tool relied heavily on credit scores without accounting for other factors, like housing vouchers, that increased applicants’ ability to pay rent. A 2022 Urban Institute study found that white communities had higher median credit scores than racial minority communities.

Under the first Trump administration, the Department of Housing and Urban Development acknowledged in a 2019 rulemaking that disparate impact methods are “an important tool to root out” potential discrimination in algorithmic systems, while stressing the need for updated policies as technology evolves.

But in Trump’s second term, HUD has changed its tune, arguing that disparate impact enforcement by federal agencies was unfair to businesses and led to illegal racial preferences.

“The issue isn’t AI – it’s disparate impact, a discredited theory that requires individuals and entities to consider race on the front end to avoid legal liability on the back end. That runs counter to the core purpose of the Constitution’s Equal Protection Clause,” HUD spokesperson Robbie Myers told POLITICO. “HUD will continue to hold bad actors accountable under the Fair Housing Act.”

HUD and the Consumer Financial Protection Bureau have proposed rules to roll back their disparate impact protections.

“One hallmark of the second Trump administration has been the eradication of discriminatory race-based policies that have permeated every aspect of government under the banner of ‘diversity, equity and inclusion,’” said Rachel Cauley, a spokesperson for the Office of Management and Budget who is also acting as spokesperson for the bureau, in a statement. “The administration of fair lending laws is no exception.”

This article is republished through the USVI News affiliate desk. Reporting, analysis, and viewpoints are those of the original publisher and do not necessarily reflect USVI News.

Read more at Politico
