She didn’t get an apartment because of an AI-generated score – and sued to help others avoid the same fate
Despite a stellar reference from a landlord of 17 years, Mary Louis was rejected after being screened by the tenant-screening firm SafeRent
Louis and the other named plaintiff, Monica Douglas, alleged that the company’s algorithm disproportionately gave lower scores to Black and Hispanic renters who use housing vouchers than to white applicants. The software, they argued, weighed account information irrelevant to whether they would be good tenants – credit scores and non-housing-related debt – while failing to factor in that they would be paying with a housing voucher.

“It became increasingly clear that defending the SRS Score in this case would divert time and resources SafeRent can better use to serve its core mission of giving housing providers the tools they need to screen applicants,” SafeRent said in a statement.