According to HUD’s charge, Facebook enabled advertisers to exclude people whom Facebook classified as parents; non-American-born; non-Christian; interested in accessibility; interested in Hispanic culture; or as having a wide variety of other interests that closely align with the Fair Housing Act’s protected classes.
In a practice evoking pre-Civil Rights-era redlining, HUD also charges, Facebook enabled advertisers to exclude people based on their neighborhood by drawing a red line around those neighborhoods on a map. Facebook also allegedly gave advertisers the option of showing ads only to men or only to women.
Facebook has 210 million U.S. users. The majority of its revenue, almost $17 billion in the fourth quarter of 2018 alone, comes from advertising, making it the largest digital advertising platform in the nation.
HUD's lawsuit asserts that Facebook also uses the protected characteristics of people to determine who will view ads regardless of whether an advertiser wants to reach a broad or narrow audience. HUD claims Facebook combines data it collects about user attributes and behavior with data it obtains about user behavior on other websites and in the non-digital world.
Facebook then allegedly uses machine learning and other prediction techniques to classify and group users to project each user’s likely response to a given ad, and in doing so, may recreate groupings defined by their protected class.
By grouping users who have similar attributes and behaviors (unrelated to housing) and presuming a shared interest or disinterest in housing-related advertisements, Facebook’s mechanisms function just like an advertiser who intentionally targets or excludes users based on their protected class, according to HUD.
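HUD’s proxy-discrimination theory can be illustrated with a toy sketch. The data, the feature names, and the clustering routine below are all invented for illustration and have no connection to Facebook’s actual systems; the point is only that grouping users by a seemingly neutral attribute can recreate groups defined by a protected class whenever the two correlate.

```python
# Hypothetical example: each user is (protected_class, proxy_feature).
# The proxy (say, an interest score) is never the protected class
# itself, but in this invented data the two are correlated.
users = [
    ("group_a", 0.10), ("group_a", 0.15), ("group_a", 0.20),
    ("group_a", 0.25), ("group_b", 0.80), ("group_b", 0.85),
    ("group_b", 0.90), ("group_b", 0.95),
]

def two_means(points, iters=10):
    """Toy 1-D k-means with k=2: settle on two centroids."""
    lo, hi = min(points), max(points)
    for _ in range(iters):
        a = [p for p in points if abs(p - lo) <= abs(p - hi)]
        b = [p for p in points if abs(p - lo) > abs(p - hi)]
        lo = sum(a) / len(a)
        hi = sum(b) / len(b)
    return lo, hi

scores = [s for _, s in users]
c_lo, c_hi = two_means(scores)

# Assign each user to the nearer centroid -- the protected class is
# never consulted, yet the resulting groups reproduce it exactly.
clusters = {0: [], 1: []}
for cls, s in users:
    clusters[0 if abs(s - c_lo) <= abs(s - c_hi) else 1].append(cls)

print(clusters[0])  # every member is "group_a"
print(clusters[1])  # every member is "group_b"
```

In this contrived data the "neutral" score separates the two protected groups perfectly, so an ad delivered only to one cluster reaches only one protected class, which is the mechanism HUD alleges.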
“Facebook is discriminating against people based upon who they are and where they live,” said HUD Secretary Ben Carson. “Using a computer to limit a person’s housing choices can be just as discriminatory as slamming a door in someone’s face.”