Despite Coming Under Fire for Racist Ads, Discriminatory Targeting Continues on Facebook
In 1968, Congress enacted the Fair Housing Act with the central objective of prohibiting racial discrimination in the sale and rental of housing. More than fifty years later, racial discrimination continues to prevent Black people from accessing housing, and Facebook Ads seem to be making matters worse.
In 2016, investigative journalist Julia Angwin reported in ProPublica that Facebook not only lets advertisers target ads by users' interests and backgrounds, but also lets them exclude users on the basis of race and gender through a category it called 'Ethnic Affinities'.
This sort of exclusion is prohibited by federal law under the Fair Housing Act, and six years after Angwin’s article, Meta still hasn’t fixed its racist ad system.
Despite Meta's attempts to fix its algorithms, its ad system continues to target users in discriminatory ways: "This is the problem with automated systems. They can create discrimination even when discriminatory variables are removed from their inputs because they often have enough information to make surprisingly accurate inferences," Angwin writes in The New York Times.
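The mechanism Angwin describes is often called proxy discrimination: even when a protected attribute is stripped from a model's inputs, a correlated feature such as ZIP code can stand in for it. The sketch below uses purely synthetic data and hypothetical variable names to illustrate how a decision rule that never sees race can still produce racially skewed ad exposure; it is an illustration of the general idea, not a description of Facebook's actual system.

```python
# Minimal sketch of "proxy discrimination" on synthetic, hypothetical data:
# the decision rule never sees the protected attribute, yet a correlated
# proxy feature (here, a ZIP-code group) reproduces the disparity.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical protected attribute (never shown to the decision rule).
protected = rng.integers(0, 2, size=n)

# Proxy feature: a ZIP-code group strongly (but not perfectly) correlated
# with the protected attribute, as residential segregation would produce.
zip_group = np.where(rng.random(n) < 0.85, protected, 1 - protected)

# "Ad delivery" decision based only on the proxy: show the housing ad to
# ZIP group 1 far more often than to ZIP group 0.
show_ad = (zip_group == 1) & (rng.random(n) < 0.9)

# Measure the disparity the system produces without ever using `protected`.
rate_group0 = show_ad[protected == 0].mean()
rate_group1 = show_ad[protected == 1].mean()
print(f"ad exposure, group 0: {rate_group0:.2f}")
print(f"ad exposure, group 1: {rate_group1:.2f}")
# The printed rates differ sharply, even though the protected attribute
# was never an input to the decision rule.
```

Running the sketch shows a large gap in ad exposure between the two groups, which is why simply deleting a sensitive variable from a targeting system does not remove the discrimination it encodes.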
Angwin proposes holding Big Tech accountable by drawing a distinction between speech and conduct, so that companies can no longer claim blanket immunity under Section 230, which currently lets them avoid addressing the harms their technologies enable.
"Without liability, the price of doing nothing will always outweigh the cost of doing something," Angwin writes.