Elizabeth Richardson thought she had found the ideal place to live.
When she applied online in February, Richardson said she would use a federal Housing Choice Voucher (Section 8) to help pay the rent.
But according to a federal lawsuit filed on her behalf this week by Evanston-based Open Communities, Richardson, who is Black, received an artificial-intelligence-generated response stating “We are currently not accepting housing choice vouchers.”
The lawsuit says that Norfolk, Virginia-based management company Harbor Group Management’s “conduct is particularly egregious in that it uses Artificial Intelligence entities posing as Leasing Assistants to auto-reject applicants based on pre-programmed inadmissible criteria ….”
In this case, the chatbot flagged a term like “housing choice voucher” or “Section 8” on the application and automatically rejected the potential tenant.
And that rejection, Open Communities maintains, is a violation of both federal and state fair housing laws.
The apartment complex, Northgate Crossing, is in Wheeling. Open Communities deals with alleged housing law violations throughout the northern suburbs.
The lawsuit says that after hearing from Richardson, Open Communities investigated more than 100 locations nationwide managed by Harbor Group or its affiliates, and discovered the same AI-generated response, reflecting a “blanket no-voucher policy.”
The apartment application does not ask for a person’s race. That would be blatantly illegal.
So how can racial discrimination be one of the grounds for alleged violation of the law?
The suit says that because voucher users are disproportionately Black, rejecting them based on Section 8 status has the effect of racial discrimination.
The litigation cites a federal study that concluded “policies refusing to rent based on source of income [in this case, housing choice vouchers] may be ‘facially neutral,’ but may have the effect of discriminating based on race, because more Black citizens are HCV holders.”
Source of income is not a specifically protected class under the federal Fair Housing Act. But if there is a racial connection, that could be grounds for a suit.
According to the Northwest Fair Housing Alliance, “If a type of income or subsidy is inextricably related to a protected class, or disproportionately received by a protected class [race] … denial of that source of income or subsidy by a housing provider can give rise to a Fair Housing Act (FHA) violation.”
But even if the Richardson lawsuit does not succeed on federal grounds, it has a second approach to fall back on.
As of Jan. 1 of this year, source of income is protected under Illinois law. And the National Multifamily Housing Council says many other states and municipalities have similar laws.
So besides citing alleged federal law violations, the lawsuit also includes Illinois’ source of income protection.
The suit claims that as a result of being denied an apartment at Northgate Crossing, Richardson suffered financial damages while searching for another dwelling, and also “felt discriminatorily stigmatized, devalued, discouraged, deterred and rejected from seeking housing, based on her status in protected classes.”
The suit asks for whatever actual financial damages are allowed by law, as well as punitive damages.
Evanston Now contacted Harbor Group Management, and received an emailed response that the company is aware of the lawsuit but does not comment on pending litigation.
We also contacted PERQ Marketing, of Indianapolis, the artificial intelligence provider whose product, according to the suit, was used by the management company or its affiliates, but have not received a response. (Update: In an emailed response received Oct. 2, a company spokesperson wrote: “As a policy, PERQ does not comment on pending legal matters.”)
We also reached out to Open Communities, but an official told us via email that the organization will stick with what’s said in the lawsuit and not do any interviews.
But the attorney who filed the suit for Elizabeth Richardson and Open Communities, Jennifer Soule, said the case is “potentially groundbreaking.”
She said, “We’re hearing a lot about AI and algorithms embedding racial bias, and this is a real-life example of software being pre-programmed” and doing just that.
She also said that AI can be made to seem conversational, so the potential renter may feel like they are communicating with a real person.
Soule said the lawsuit is trying to stop the practice of blanket denials, which, when automatically generated by AI, “amounts to wholesale exclusion of people with housing vouchers, who are well-known to be predominantly Black.”
The Open Communities lawsuit says that after a while, the management company appeared to figure out that it was being investigated, and in September, Open Communities “received a recently crafted and radically different response” from a Texas apartment complex that had previously rejected a test applicant.
The newer response said that “we provide housing opportunities to all prospective residents who meet our screening criteria, regardless of their sources of funds.”
However, despite this apparent change, the lawsuit says Richardson has not heard back from “HGMC, PERQ, or any defendant of any kind, correcting or retracting the … rejection of housing choice vouchers or inviting her to reapply.”