Facebook faces new allegations of gender discrimination in its delivery of job ads. Research by a human rights group suggests it's a global concern


This story is part of 'Systems Error', a series by CNN As Equals, investigating how your gender shapes your life online. For information about how CNN As Equals is funded and more, check out our FAQs.

(CNN)Facebook-parent Meta is facing four new complaints from human rights groups in Europe alleging that the algorithm it uses to target users with companies' job advertisements is discriminatory, years after the company first pledged to crack down on the issue in other regions.

The allegations are based on research by international nonprofit Global Witness that it says shows Facebook's ad platform often targets users with job postings based on historical gender stereotypes. Job advertisements for mechanic positions, for example, were overwhelmingly shown to men, while ads for preschool teachers were shown mostly to women, according to data Global Witness obtained from Facebook's Ads Manager platform.
Additional research shared exclusively with CNN by Global Witness suggests that this algorithmic bias is a global issue, the human rights group says.
"Our concern is that Facebook is exacerbating the biases that we live with in society and actually marring opportunities for progress and equity in the workplace," Naomi Hirst, who leads Global Witness' campaign strategy on digital threats to democracy, told CNN.
Global Witness, jointly with nonprofits Bureau Clara Wichmann and Fondation des Femmes, on Monday filed complaints against Meta (FB) with the human rights agencies and data protection authorities in France and the Netherlands, based on their research in both countries. The groups are urging the agencies to investigate whether Meta's practices violate the countries' human rights or data protection laws. If any of the agencies find the allegations substantiated, Meta could ultimately face fines, sanctions or pressure to make further changes to its product.
Global Witness previously filed complaints with the UK Equality and Human Rights Commission and Information Commissioner's Office over similar discrimination concerns, which remain under investigation. At the time, Global Witness said a spokesperson for Meta (which was still called Facebook at the time) told the group that its "system takes into account different kinds of information to try and serve people ads they will be most interested in," and that it was "exploring expanding limitations on targeting options for job, housing and credit ads to other regions beyond the US and Canada."
The European complaints also mirror a complaint filed with the US Equal Employment Opportunity Commission in December by women's trucking organization Real Women in Trucking, alleging that Facebook discriminates based on age and gender when deciding which users to show job ads to. Meta declined to comment to CNN about the Real Women in Trucking complaint.
Meta spokesperson Ashley Settle said in a statement that Meta applies "targeting restrictions to advertisers when setting up campaigns for employment, as well as housing and credit ads, and we offer transparency about these ads in our Ad Library."
"We do not allow advertisers to target these ads based on gender," Settle said in the statement. "We continue to work with stakeholders and experts across academia, human rights groups and other disciplines on the best ways to study and address algorithmic fairness."
Meta did not comment specifically about the new complaints filed in Europe. The company also did not respond to a question asking in which countries it now limits targeting options for employment, housing and credit ads.

Missing out on jobs because of your gender

Facebook has faced various claims of discrimination, including in its delivery of job advertisements, over the past decade. In 2019, as part of an agreement to settle multiple lawsuits in the United States, the platform promised to make changes to prevent biased delivery of housing, credit and employment ads based on protected characteristics, such as gender and race.
Efforts to address those disparities included removing the option for advertisers to target employment ads based on gender, but this latest research suggests that change is being undermined by Facebook's own algorithm, according to the human rights groups.
As a result, the groups say, countless users may be missing out on the opportunity to see open jobs they could be qualified for, simply because of their gender. They worry this could exacerbate historic workplace inequities and pay disparities.
"You cannot escape big tech anymore, it's here to stay and we have to see how it impacts women's rights and the rights of minority groups," said Linde Bryk, head of strategic litigation at Bureau Clara Wichmann. "It's too easy, as a corporation, to just hide behind the algorithm, but if you put something on the market ... you should also be able to control it."
Global Witness conducted additional experiments in four other countries — including India, South Africa and Ireland — and says the research shows that the algorithm perpetuated similar biases around the world.
With more than 2 billion daily active users around the world, Facebook can be a key source for helping users find job openings.
The platform's business model relies on its algorithm's careful targeting of advertisements to the users it thinks are most likely to click on them — so that ad buyers see returns from their spending on the platform. But Global Witness' research suggests that this results in job ads being targeted to users based on gender stereotypes. And in some cases, human rights advocates say, the biases that appear to be shown by Facebook's ad system may exacerbate other disparities.
In France, for example, Facebook is often used for job searches by people of lower income levels, meaning the people most affected by its alleged algorithmic biases may be those already in marginalized positions, said Caroline Leroy-Blanvillain, lawyer and member of the legal force steering committee at Fondation des Femmes.
Pat de Brún, head of Amnesty International's big tech accountability team, said he was not necessarily surprised by the findings of Global Witness' research. "Research consistently shows how Facebook's algorithms deliver deeply unequal outcomes and often reinforce marginalization and discrimination," de Brún told CNN. "And what we see is the reproduction and amplification of some of the worst aspects of society."
"We have this illusion of neutrality that the algorithms can provide, but actually they're very often reproducing those biases and often obscuring the biases and making them more difficult to challenge," he said.

Gendered targeting

To conduct the experiments cited in the complaints, Global Witness ran a series of job ads in France and the Netherlands over two-day periods between February and April. The advertisements linked to real job postings found on employment websites, and researchers selected positions — including preschool teacher, psychologist, pilot and mechanic — traditionally associated with gender stereotypes.
Global Witness targeted the ads to adult Facebook users of any gender who resided in, or had recently visited, the chosen countries. The researchers requested that the ads "maximize the number of link clicks," but otherwise left it up to Facebook's algorithm to determine who ultimately saw the advertisements.
The ads were often shown to users along heavily gendered lines, according to an analysis of the data provided by Facebook's ad manager platform.
"Just because advertisers can't select it, doesn't mean that the 'gender' [category] doesn't weigh in the process of showing ads at all," one of the Netherlands complaints states.
In France, for example, 93% of the users shown a preschool teacher job ad and 86% of those shown a psychologist job ad were women, while women comprised just 25% of users shown a pilot job ad and 6% of those shown a mechanic job ad, according to Facebook's ad manager platform.
Similarly, in the Netherlands, 85% of the users shown a teacher job ad and 96% of those shown a receptionist job ad were women, while just 4% of those shown a job ad for a mechanic were women, according to Facebook's data. Certain roles were less strongly skewed — a package delivery job ad, for example, was shown to 38% women users in the Netherlands.
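The arithmetic behind figures like these is simple: Facebook's Ads Manager reports each ad's impressions broken down by gender, and the share shown to women follows directly. A minimal sketch of that calculation (the impression counts below are hypothetical illustrations, not Global Witness' actual data):

```python
# Hypothetical per-ad impression counts by gender, in the shape of an
# Ads Manager demographic breakdown (figures invented for illustration).
impressions = {
    "preschool teacher": {"women": 930, "men": 70},
    "mechanic": {"women": 60, "men": 940},
}

def women_share(counts: dict) -> int:
    """Return the percentage of an ad's impressions shown to women, rounded."""
    total = counts["women"] + counts["men"]
    return round(100 * counts["women"] / total)

for ad, counts in impressions.items():
    print(f"{ad}: {women_share(counts)}% of viewers were women")
```

Crucially, the skew this computation reveals arises even though the advertiser never selects a gender; the split is a product of Facebook's delivery algorithm alone.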
The results mirrored those Global Witness had previously found in the United Kingdom, where women were more often shown ads for nursery teacher and psychologist jobs, and men were overwhelmingly shown ads for pilot and mechanic positions.
In some cases, the degree of gender imbalance in how users were targeted for certain jobs varied by country — in India, just 39% of the users shown a psychologist job ad were women, while in Europe and South Africa, women were more likely than men to be shown psychologist job ads. A further exception was pilot ads shown in South Africa, which were more balanced, with 45% of users shown a pilot ad being women.
Global Witness also ran tests in Indonesia, but Facebook's ad manager was unable to identify the genders of many of the users who were shown the advertisements, making it difficult to conduct a robust analysis of the results there.
"Even though Facebook may have become less fashionable in certain countries, it remains the key communications platform for much of the world ... as the public square where public discourse happens," Amnesty International's de Brún said. "They should be ensuring these discriminatory outcomes do not happen, intentionally or not."
Because little information is publicly available about how Facebook's algorithm works, the complaints acknowledge that the cause of the gender skew is not exactly clear. One of the Netherlands complaints speculates that the algorithm may have been trained on "contaminated" data, such as outdated information about which genders typically hold which roles.
Meta did not respond to questions from CNN about how the algorithm that runs its ad system is trained. In a 2020 blog post about its ad delivery system, Facebook said ads are shown to users based on a variety of factors, including "behavior on and off" the platform. Earlier this year, Facebook launched a "variance reduction system" — a new machine learning technology — to "advance equitable distribution" of housing ads in the United States, and said it planned to expand the system to US employment and credit ads.

Seeking algorithmic transparency

From November 2016 to September 2018, Facebook was hit with five discrimination lawsuits and charges from US civil rights and labor organizations, workers and individuals, alleging that the company's ad systems excluded certain people from seeing housing, employment and credit ads based on their age, gender or race.
The legal actions followed a slew of critical coverage of Facebook's advertising systems, including one 2018 ProPublica investigation that found Facebook was facilitating the spread of discriminatory advertisements by allowing employers using its platform to target users of only one sex with job ads. Some companies were targeting only men with ads for trucking or police jobs, for example, while others targeted only women with ads for nursing or medical assistant jobs, according to the report. (A Facebook spokesperson said in a statement responding to the report at the time that discrimination is "strictly prohibited in its policies" and that it would "defend our practices.")
In March 2019, Facebook agreed to pay nearly $5 million to settle the lawsuits. The company also said it would launch a different advertising portal for housing, employment and credit ads on Facebook, Instagram and Messenger offering fewer targeting options.
"There is a long history of discrimination in the areas of housing, employment and credit, and this harmful behavior should not happen through Facebook ads," then-Facebook COO Sheryl Sandberg said in a blog post at the time of the settlement. Sandberg added that the company had engaged a civil rights firm to review its ad tools and help it understand how to "guard against misuse."
Later that year, the US Equal Employment Opportunity Commission ruled that seven employers who bought Facebook ads targeting workers of only certain ages or genders had violated federal law.

In addition to restricting advertisers from targeting employment, housing and credit ads based on gender, Facebook also prohibits targeting based on age and requires that location targeting have a minimum radius of 25 kilometers (or about 15.5 miles), the company says. For all advertisements on its platform, Facebook in 2022 removed targeting options based on sensitive characteristics, such as religious practices or sexual orientation. The company also requires advertisers to comply with its non-discrimination policy, and makes all ads available for anyone to view in its Ad Library.
Still, researchers have continued to find evidence that Facebook's delivery of job advertisements may be discriminatory, including a study out of the University of Southern California published in 2021.
In December, Real Women in Trucking filed its EEOC complaint alleging that Facebook's job ads algorithm discriminates based on age and gender. "Men receive the lion's share of ads for blue-collar jobs, especially jobs in industries that have historically excluded women," the complaint states, while "women receive a disproportionate share of ads for lower-paid jobs in social services, food services, education, and health care."
"People don't look for jobs or housing in newspapers, or even the radio, anymore, they go online, that's where all information flows for economic opportunities," said Peter Romer-Friedman, one of the attorneys representing Real Women in Trucking. "If you're not part of the group that's receiving the information, you lose out on the opportunity to hear about and pursue that job."
Romer-Friedman was also on the negotiating team that worked on the 2019 settlement agreement with Facebook. At the time, he said, he and others raised concerns that while Facebook's promised changes were a step in the right direction, the same bias issues could be replicated by the platform's algorithm.
Meta declined to comment on the EEOC complaint from Real Women in Trucking; filings in cases with the agency are not publicly available.
The French and Dutch agencies will have discretion over whether to take up the investigations requested in the latest complaints. Global Witness and its partners say they hope that decisions by the human rights agencies on their findings could put pressure on Meta to improve its algorithm, increase transparency and prevent further discrimination. Meta could also face significant fines if the countries' data protection agencies decide to investigate the issue and ultimately find the company to have violated the EU's General Data Protection Regulation, which prohibits discriminatory use of user data.
"What we're hoping with these complaints is that it forces [Facebook] to the table to crack open the black box of their algorithm, to explain how they can correct what appears to be ... discrimination by their algorithm," Global Witness' Hirst said. "I think we know enough about gendered workforces and gendered jobs to say that Facebook is adding to the problem."
Commissioning Editor: Meera Senthilingam
Editor: Seth Fiegerman
Data and Graphics Editor: Carlotta Dotto
Illustrations: Carolina Moscoso for CNN
Visual Editors: Tal Yellin, Damian Prado, David Blood and Gabrielle Smith