AI is used to discriminate against renters, a Chicago-area housing group says in lawsuit

Open Communities says a management company used a chatbot to illegally rule out a renter who had a housing voucher.

Open Communities, a housing advocacy group, alleges that a management company used a chatbot to discriminate against renters with subsidized housing vouchers. Illustration by Mendy Kong / WBEZ


An Evanston-based housing advocacy group has filed a federal lawsuit alleging that a property management company used artificial intelligence to discriminate against Black renters.

Open Communities says Harbor Group Management, based in Virginia, used a leasing chatbot to reject a prospective tenant for having a housing choice voucher, commonly known as Section 8.

Elizabeth Richardson, who is Black, told Open Communities that in late February she communicated with a chatbot on the website for the Northgate Crossing Apartments in north suburban Wheeling. When Richardson asked whether the Northgate accepted renters with housing vouchers, she received an email minutes later saying: “Here’s our information regarding Section 8 housing. We are currently not accepting housing choice vouchers.”

In Illinois, it is illegal to discriminate against tenants based on their source of income, including subsidized housing vouchers. Landlords must give renters the opportunity to apply and to list vouchers as a source of income when doing so.

Richardson’s lawsuit, filed last week in U.S. District Court for the Northern District of Illinois, is the culmination of a six-month investigation by Open Communities that found Harbor Group’s use of AI tools “consistently led to discriminatory outcomes,” disproportionately impacting Black would-be applicants.

Open Communities CEO Cheryl Lawrence said discrimination against low-income renters with Section 8 vouchers has been prevalent for years – but AI is just the newest iteration. The result is “perpetuating segregation.”

According to Open Communities, African Americans make up 78% of voucher holders in Illinois. In the Chicago area, it’s 85%. A 2023 report by the National Fair Housing Alliance says “source of income discrimination disproportionately affects renters based on race, disability and gender” and “may be a camouflage for race and National Origin based discrimination.”

Lawrence said with more people searching for housing online, third-party tenant screening services are available 24/7 through chatbots.

“Property owners and property managers are using these services to eliminate populations of people that they do not want to rent to,” Lawrence said. “They can simply configure this bot, and people will … self-select out.”

The lawsuit said after being unable to secure housing in Wheeling near her job, Richardson was “forced to live with relatives for an extended period of time, causing difficulties in finding suitable housing, and greatly increasing inconvenience relating to employment opportunities and daily life.” She also suffered financial loss and negative health consequences due to the inability to find an apartment, the lawsuit said.

With AI becoming more prevalent in all industries, experts say lawsuits and complaints related to potential discrimination will continue to increase.

Kristian Hammond, a professor of computer science at Northwestern University, said the use of chatbots to attend to customers can be positive or negative.

“I can imagine circumstances where the depersonalization and moving away from an individual’s bias would be super positive,” Hammond said. “But if you build bias into the system, that means that you can let the system … do what you want it to do without you having to look someone in the eye while it’s doing it.”

In the case of Harbor Group Management, Hammond said, the AI system “was doing what it was told to do.”

Harbor Group Management did not respond to WBEZ’s request for comment. WBEZ also reached out to PERQ, the company that runs the AI chatbots on Harbor Group’s websites. Chief Marketing Officer Maribeth Ross responded: “As a policy, PERQ does not comment on pending legal matters.”

Esther Yoon-Ji Kang is a reporter on WBEZ’s Race, Class and Communities desk. Follow her on X @estheryjkang.