Automated housing discrimination
In recent years, most landlords and property managers have come to rely on automated reports when researching prospective tenants — but errors in those reports have unfairly prevented some renters from getting housing, sometimes because of criminal records that aren’t theirs.
Renters are increasingly falling victim to algorithmic discrimination by companies that instantaneously spit out tenant-screening reports for landlords based on publicly available information such as sex-offender registries and housing court documents.
Tenant screening used to be a matter of checking credit scores and references, but in recent years a billion-dollar industry has sprouted up around providing landlords with deep dives into the lives of people seeking housing. Because of a lack of human oversight and fact-checking, these reports can include information about crimes and evictions that don’t pertain to the prospective tenants being screened.
Attorneys from Consumer Attorneys can assist renters in taking on the companies that provided inaccurate information to landlords, fighting to clear their names and even helping them win financial compensation.
Landlords are legally allowed to obtain prospective renters’ credit records and criminal histories during the screening process. Some background check companies provide them with color-coded assessments of a would-be tenant’s risk, or a “pass-fail” grade as to whether the applicant should be given housing.
And renters who get turned down for a new place to live are often left in the dark as to the exact reason for their rejection.
Some renters have been shut out from housing because their names are similar or identical to those of individuals who have lengthy rap sheets, are listed on sex-offender registries or appear in arrest records.
In 2016, the US Department of Housing and Urban Development issued guidelines stating that, when poring over the criminal records of prospective renters, landlords should focus not only on the type of crime committed but also on other circumstances, such as how long ago the offense occurred — finding that focusing solely on criminal records disproportionately harms Black and Latino people.
In one federal case filed in 2018, a Connecticut mother alleges that her application to move her disabled son into her apartment so that she could care for him was denied because he was busted for shoplifting in 2014. The suit has survived a motion to dismiss by CoreLogic, the background check company that produced the report, and is ongoing.
Under the Fair Credit Reporting Act, landlords are required to tell tenants if their credit histories or criminal records served as a basis for denying their applications for housing — a disclosure called an “adverse action notice.”
Additionally, background check companies have 30 days to correct a renter’s report once a dispute is filed. But by the time that window has closed, the landlord has likely moved on to another applicant and rented out the property.
How We Can Help
If a renter has been denied housing based on erroneous information that a landlord obtained from an automated background check, the team at Consumer Attorneys can work with the client to determine whether they have fallen victim to algorithmic discrimination, help clear their record so they can get a roof over their head, and hold the companies that provided the inaccurate information financially liable for damages.