
Facebook and Instagram parent company Meta Platforms Inc. and the U.S. Department of Justice announced Monday they had come to an agreement regarding compliance targets for a new algorithmic system.
According to the DOJ, a lawsuit settlement from June required Meta to build a new system to “address algorithmic discrimination” in the delivery of housing advertisements.
Meta will be subject to court oversight and regular review of its compliance with the settlement through June 27, 2026, the DOJ said. It called the development a “key milestone” in its settlement with Meta to prevent discriminatory advertising in violation of the Fair Housing Act (FHA).
A U.S. complaint against the company alleged that Meta uses algorithms that rely in part on characteristics protected under the FHA to determine which users receive advertisements, including housing advertisements. Specifically, Meta allegedly “feeds troves of user information into its advertisement delivery system, including information related to users’ FHA-protected characteristics such as sex and race, and uses that information in its personalization algorithms to predict which advertisement is most relevant to which user,” according to the DOJ.
Meta’s new system is called the Variance Reduction System (VRS) and it is intended to reduce the variances between the eligible audiences and the actual audiences of ads. The U.S. has determined that this new system “will substantially reduce the variances between the eligible and actual audiences along sex and estimated race/ethnicity in the delivery of housing advertisements,” and it will operate on all housing ads across Meta’s platforms.
Going forward, Meta is expected to meet compliance metrics in stages. One example is a Dec. 31 deadline: variance must be at or below 10% for 91.7% of housing advertisements with respect to sex, and at or below 10% for 81.0% of those advertisements with respect to estimated race/ethnicity.
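To make the metric concrete: the settlement's "variance" compares the demographic makeup of an ad's eligible audience with the audience that actually saw the ad. Meta's exact VRS formula is not spelled out here, so the following is only an illustrative sketch, assuming variance is measured as the largest absolute gap in group shares between the two audiences, with the `audience_variance` and `compliance_rate` helpers being hypothetical names invented for this example.

```python
# Hypothetical illustration of the settlement's variance idea.
# The real VRS metric is not public in this article; this sketch
# assumes variance = largest absolute gap in demographic shares
# between an ad's eligible audience and its delivered audience.

def audience_variance(eligible_counts, actual_counts):
    """Max absolute difference in group shares (0.0 to 1.0)."""
    eligible_total = sum(eligible_counts.values())
    actual_total = sum(actual_counts.values())
    return max(
        abs(eligible_counts[g] / eligible_total
            - actual_counts.get(g, 0) / actual_total)
        for g in eligible_counts
    )

def compliance_rate(ads, threshold=0.10):
    """Fraction of ads whose variance is at or below the threshold."""
    within = sum(
        1 for eligible, actual in ads
        if audience_variance(eligible, actual) <= threshold
    )
    return within / len(ads)

# Example: eligible audience is split 50/50 by sex,
# but delivery skews 65/35 -- a 15% variance, over the 10% target.
eligible = {"female": 5000, "male": 5000}
actual = {"female": 650, "male": 350}
print(audience_variance(eligible, actual))  # 0.15
```

Under a definition like this, the Dec. 31 targets would translate to `compliance_rate(...)` being at least 0.917 for sex and at least 0.81 for estimated race/ethnicity across housing ads.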
“This development marks a pivotal step in the Justice Department’s efforts to hold Meta accountable for unlawful algorithmic bias and discriminatory ad delivery on its platforms,” Assistant Attorney General Kristen Clarke of the Justice Department’s Civil Rights Division said in the Monday announcement.
She added that monitoring of Meta should be a deterrent for other companies attempting to use discriminatory advertising practices.
U.S. Attorney Damian Williams for the Southern District of New York called it a “groundbreaking resolution,” and applauded Meta for “taking the first steps towards addressing algorithmic bias.”
According to the DOJ, Meta also selected Guidehouse Inc. as an independent, third-party reviewer to investigate and verify on an ongoing basis whether the VRS is meeting the compliance metrics agreed to by the parties. Meta is expected to provide both the U.S. and Guidehouse with regular compliance reports.
This is the first time Meta has been subject to court oversight for its advertisement targeting and delivery system.
“The court will have ultimate authority to resolve any disputes over the information that Meta must provide,” said the Justice Department. Additionally, Meta will “cease delivering housing advertisements using the Special Ad Audience tool (which delivered advertisements to users who ‘look like’ other users), and Meta will not provide any targeting options for housing advertisers that directly describe or relate to FHA-protected characteristics.”