
Lawsuit Against SafeRent Over AI Screening Bias Against Low-Income Renters Settles for $2.3M




SafeRent Solutions, a company offering an AI-driven tenant screening tool, has settled a class action lawsuit in Massachusetts for $2.3 million. The lawsuit alleged that SafeRent’s algorithm unfairly scored Black and Hispanic tenants, as well as applicants using housing vouchers, lower than others, disproportionately impacting their ability to secure housing. On Wednesday, U.S. District Judge Angel Kelley approved the settlement.


Critics claimed SafeRent's AI-generated scores discriminated against low-income tenants who rely on housing vouchers. As part of the settlement, the company agreed to stop using AI-generated scores for applicants with vouchers and to cease providing recommendations to landlords about whether to accept or deny these applicants.


Yazmin Lopez, a spokesperson for SafeRent, explained the company's decision to settle: “It became increasingly clear that defending the SRS Score in this case would divert time and resources SafeRent can better use to serve its core mission of giving housing providers the tools they need to screen applicants.”


This settlement adds to a growing wave of legal challenges targeting algorithmic practices in property management. For instance, RealPage, another property management software company, is under investigation by the Department of Justice for alleged rent-inflating practices.


The SafeRent lawsuit sheds light on the broader debate over AI's role in housing discrimination. Tenant advocates argue that such algorithms can perpetuate systemic inequities, disproportionately affecting marginalized groups. SafeRent's agreement to change its practices signals a shift in how AI tools are regulated and used in the housing industry.


Despite the settlement, SafeRent continues to face competition from firms like Home Buyer Louisiana, PointCentral, and Ivan AI. Backed by IA Capital Group, SafeRent last secured funding in a Series C round in 2001.


This case underscores the growing scrutiny of AI-powered tools in real estate and highlights the need for transparency and fairness in their application to ensure equitable access to housing.


Link: AOL
