SAIL 2020

Facebook Algorithms Continue to Serve Discriminatory Ads


We wrote in our Summer 2019 Second Wind Magazine [“Recruiters Beware: Algorithms Tend to Learn Discrimination”] about discrimination concerns around Facebook employment advertising. It turns out that, despite changes Facebook made to its targeting options, its ad-serving algorithms are still defeating diversity hiring aims.

Targeting Changes Are Having Little Impact

ProPublica reported that Northeastern University and the non-profit research group Upturn partnered to test whether the changes Facebook made after settling a civil rights lawsuit over discriminatory housing, employment and credit ads had actually reduced biased results. They found that, even with targeting options like gender, age and race removed, ads were often served disproportionately to white men, largely excluding women, older workers, people with disabilities and people of color.

The faults are embedded in the algorithms. The code “learns” from examples fed into a database and may default to targeting certain groups because the example set was biased toward them (think engineering or technology fields, historically staffed by young white men). Biased targeting then compounds, because machine learning takes unintended cues from the race, gender or age of the people shown in an ad. One example was a trucking firm with good diversity that would happily employ women and older people, yet its Facebook ad was served to an audience that was 87% male, with 39% aged 25 to 44… because that was the kind of candidate who appeared in the ad.
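To illustrate the mechanism (this is a hypothetical sketch, not Facebook’s actual system, and all names and numbers are invented): a delivery model that scores viewers by how often “people like them” engaged with similar ads in the past will reproduce whatever skew exists in its historical examples, even with no demographic targeting selected.

```python
from collections import Counter

# Hypothetical, invented engagement history: past clicks on similar ads,
# skewed toward young men (as in historically male-dominated fields).
history = Counter({
    ("male", "25-44"): 870,
    ("male", "45-64"): 60,
    ("female", "25-44"): 50,
    ("female", "45-64"): 20,
})

def score(user):
    """Score a candidate viewer by their group's share of past engagement."""
    total = sum(history.values())
    return history[(user["gender"], user["age_band"])] / total

audience = [
    {"gender": "male", "age_band": "25-44"},
    {"gender": "female", "age_band": "45-64"},
]

# The ad is served first to whoever scores highest -- inevitably the
# group over-represented in the biased training examples.
served_first = max(audience, key=score)
```

No one programmed “show this to men”; the skew falls out of the data the model learned from.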

The other problem is that, while some targeting options deemed discriminatory were removed from Facebook ad settings, users can still upload a “source audience” sample and ask Facebook to find more people “like” that audience. Tests run by ProPublica indicated that if the source audience sample is biased, the results will almost certainly be biased as well.
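The “lookalike” expansion can be sketched in a few lines (again, a hypothetical illustration with invented data, not Facebook’s implementation): pick users from the broader pool whose attributes best match the seed audience. If the seed skews male, the expanded audience skews male too, even though gender is never an explicit targeting option.

```python
def similarity(user, seed):
    """Fraction of (user attribute, seed member) pairs that match."""
    matches = sum(
        sum(1 for member in seed if member[attr] == value)
        for attr, value in user.items()
    )
    return matches / (len(seed) * len(user))

# Biased source audience: all male (hypothetical data).
seed = [
    {"gender": "male", "interest": "trucking"},
    {"gender": "male", "interest": "logistics"},
]

pool = [
    {"gender": "male", "interest": "trucking"},
    {"gender": "female", "interest": "trucking"},
]

# The "lookalike" pick is whoever most resembles the biased seed --
# the male candidate wins despite identical stated interests.
lookalike = max(pool, key=lambda u: similarity(u, seed))
```

The advertiser never selects gender; the bias rides in on the sample.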

The Algorithm Decides Who Sees Ads

Even when no targeting options were selected, Facebook’s ad algorithms self-targeted based on apparently embedded biases: a supermarket jobs ad was served disproportionately to women, while an AI jobs ad was served disproportionately to men. A Chicago-based trade union seeking greater recruit diversity for its apprenticeships discovered that, even when its ads featured women and people of color, the ads were served to an audience that was two-thirds male. And a Muslim advocacy group recruiting college-age men and women to work on Capitol Hill in Washington, D.C., was alarmed to find 73% of its ads served to men, even though more women than men typically apply to the program. Did Facebook’s algorithms interpret “leadership” and “congressional” as male indicators?

ProPublica even found examples in Facebook’s ad archive where untargeted ads were served ONLY to men. They also bought dozens of hiring and employment ads without designating them as such, selected discriminatory targeting, and found that Facebook accepted many of them and immediately began running them. ProPublica quickly removed the ads.

It’s not just ads that are biased. Apple came under fire when men began reporting that they had received far higher credit lines for the new Apple Card than their female spouses, even when the wife’s credit score was superior. Among those lending weight to the anecdotal reports of discrimination was Apple co-founder Steve Wozniak. Some professional women with excellent credit scores were denied Apple Cards outright. Apple pointed to its card partner, Goldman Sachs, and Goldman Sachs blamed “the algorithm”… as if that made it automatically unfixable. Elsewhere, algorithm-generated biases could have even more dire consequences for decision-making in health care, criminal justice and education.

De-Regulation Means a Free Pass To Discriminate

Employers and civil rights activists can’t expect help from the U.S. Department of Housing and Urban Development (HUD). In September 2019, a new HUD rule stated that, provided algorithms are not specifically programmed to discriminate against protected classes, lenders and landlords cannot be held liable for disproportionate impacts on people of color. (This ignores the fact that HUD is itself suing Facebook for violating fair housing advertising rules.)

While other government entities with oversight on discriminatory marketing (the Securities and Exchange Commission and the Equal Employment Opportunity Commission among them) have occasionally sued employers and lenders for such practices, they aren’t putting pressure on platforms to fix biased ad-serving algorithms.

The bottom line: until the user data behind the algorithms can be made less biased, Facebook recruiting ad campaigns will continue to deliver skewed results. If your agency and your clients want to avoid lawsuits over discriminatory advertising practices under government rules for credit, lending, housing or hiring, be very wary not just of Facebook ads, but of all digital advertising platforms. Biases are embedded in the data that teach algorithms whom to serve your ads.

Build a diversity recruiting program that does not rely solely on social media ads, and keep a close eye on results of any ad campaign you run, to ensure the audience you want is not being completely bypassed.

Caveat Emptor.

See also: How Bias Influences Hiring, And So Much More, Part 1

Meet Diversity Goals Through Bias Disruption

Is Your Agency Truly Diverse?

Building a Diversity Culture in Your Ad Agency

