How to put an end to gender biases in internet algorithms
A well-known recruitment tool's algorithm preferred male over female applicants (photo: Becca Tapert)

23/11/2022
Beatriz de Vera

According to experts, many internet algorithms are based on stereotypes, leading them to associate the sciences with masculinity and the arts with femininity


This research indicates that everyone involved in developing algorithms needs to understand the measures available to minimize potential biases and to apply those measures to prevent bias from occurring

Endless screeds have been penned on whether the internet algorithms with which we constantly interact suffer from gender bias, and all you need to do is carry out a simple search to see this for yourself. However, according to the researchers behind a new study that seeks to reach a conclusion on this matter, "until now, the debate has not included any scientific analysis". This new article, by a transdisciplinary team, puts forward a new way of tackling the question and suggests some solutions for preventing these deviations in the data and the discrimination they entail.

Algorithms are increasingly being used to decide whether to grant a loan or to accept applications. As the range of uses for artificial intelligence (AI) grows, along with its capabilities and importance, it becomes increasingly vital to assess any possible prejudices associated with these operations. "Although it's not a new concept, there are many cases in which this problem has not been examined, thus ignoring the potential consequences," stated the researchers, whose study, published open access in the journal Algorithms, focused mainly on gender bias in the different fields of AI.

Such prejudices can have a huge impact upon society: "Biases affect everything that is discriminated against, excluded or associated with a stereotype. For example, a gender or a race may be excluded in a decision-making process or, simply, certain behaviour may be assumed because of one's gender or the colour of one's skin," explained the principal investigator of the research, Juliana Castañeda Jiménez, an industrial doctorate student at the Universitat Oberta de Catalunya (UOC) under the supervision of Ángel A. Juan, of the Universitat Politècnica de València, and Javier Panadero, of the Universitat Politècnica de Catalunya - BarcelonaTech (UPC).

According to Castañeda, "it is possible for algorithmic processes to discriminate by reason of gender, even when programmed to be 'blind' to this variable". The research team – which also includes researchers Milagros Sáinz and Sergi Yanes, both of the Gender and ICT (GenTIC) research group of the Internet Interdisciplinary Institute (IN3), Laura Calvet, of the Salesian University School of Sarrià, Assumpta Jover, of the Universitat de València, and Ángel A. Juan – illustrates this with a number of examples: the case of a well-known recruitment tool that preferred male over female applicants, or that of some credit services that offered less favourable terms to women than to men. "If old, unbalanced data are used, you're likely to see negative conditioning with regard to black, gay and even female demographics, depending upon when and where the data are from," explained Castañeda.
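Castañeda's point, that removing the gender variable does not make a model gender-neutral, can be sketched in a few lines of Python. This is a toy illustration with invented data and feature names, not the study's method: a "gender-blind" rule learned from biased historical outcomes still produces unequal approval rates, because a proxy feature (here, a CV gap) correlates with gender.

```python
# Hypothetical historical hiring records: (gap_in_cv, hired).
# In this biased history, career gaps (often parental leave, which in
# this toy population correlates with gender) were penalized.
history = [
    (0, 1), (0, 1), (0, 1), (0, 0),   # no CV gap: 75% hired
    (1, 0), (1, 0), (1, 0), (1, 1),   # CV gap:    25% hired
]

def hire_rate(records, gap):
    """Historical hire rate for applicants with the given gap status."""
    matching = [hired for g, hired in records if g == gap]
    return sum(matching) / len(matching)

def model(gap_in_cv):
    """'Gender-blind' rule learned from history: approve if the
    historical hire rate for this gap status exceeds 50%."""
    return hire_rate(history, gap_in_cv) > 0.5

# New applicants: gender is NOT given to the model, but the proxy
# feature (CV gap) is strongly correlated with it.
applicants = [("woman", 1), ("woman", 1), ("woman", 0),
              ("man", 0), ("man", 0), ("man", 1)]

def approval_rate(gender):
    """Share of approvals within one gender group."""
    group = [model(gap) for g, gap in applicants if g == gender]
    return sum(group) / len(group)

print(f"approval rate, women: {approval_rate('woman'):.2f}")  # 0.33
print(f"approval rate, men:   {approval_rate('man'):.2f}")    # 0.67
```

Even though gender never enters the model, the decision rule reproduces the historical disparity through the correlated proxy.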


The sciences are for boys and the arts are for girls

To understand how these patterns affect the different algorithms we deal with, the researchers analysed previous works that identified gender biases in data processes in four fields of AI: natural language processing and generation, decision management, speech recognition and facial recognition.

In general, they found that all the algorithms identified and classified white men better. They also found that the algorithms reproduced false beliefs about the physical attributes that should define someone depending upon their biological sex, ethnic or cultural background or sexual orientation, and that they made stereotypical associations linking men with the sciences and women with the arts.

Many of the procedures used in image and voice recognition are also based on these stereotypes: cameras find it easier to recognize white faces and audio analysis has problems with higher-pitched voices, mainly affecting women.

The cases most likely to suffer from these issues are those whose algorithms are built on the basis of analysing real-life data associated with a specific social context. "Some of the main causes are the under-representation of women in the design and development of AI products and services, and the use of datasets with gender biases," noted the researcher, who argued that the problem stems from the cultural environment in which they are developed.

"An algorithm, when trained with biased data, can detect hidden patterns in society and, when operating, reproduce them. So if, in society, men and women have unequal representation, the design and development of AI products and services will show gender biases."


How can we put an end to this?

The many sources of gender bias, together with the peculiarities of each type of algorithm and dataset, make doing away with this bias a very tough, though not impossible, challenge. "Designers and everyone else involved need to be informed of the possibility of biases in an algorithm's logic. What's more, they need to understand the measures available for minimizing potential biases as far as possible, and implement them so that biases don't occur, because if they are aware of the types of discrimination occurring in society, they will be able to identify when the solutions they develop reproduce them," suggested Castañeda.
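One simple, standard measure of the kind the researchers call for is auditing a model's decisions for group disparities. A minimal sketch, with invented decision data and using the common demographic parity metric (an illustrative choice, not taken from the study):

```python
# Bias audit sketch: demographic parity difference compares the
# positive-outcome rates of two groups. A value near 0 suggests
# parity; larger gaps flag decisions worth investigating.

def positive_rate(decisions):
    """Share of positive (e.g. 'approve') decisions in a group."""
    return sum(decisions) / len(decisions)

def demographic_parity_difference(group_a, group_b):
    """Absolute gap in positive-outcome rates between two groups."""
    return abs(positive_rate(group_a) - positive_rate(group_b))

# Hypothetical model decisions (1 = approved), split by gender.
decisions_men   = [1, 1, 1, 0, 1, 1, 0, 1]   # 75.0% approved
decisions_women = [1, 0, 0, 1, 0, 0, 1, 0]   # 37.5% approved

gap = demographic_parity_difference(decisions_men, decisions_women)
print(f"demographic parity difference: {gap:.3f}")  # 0.375
```

Running such a check on a deployed system's outputs is one concrete way for developers to notice when a solution is reproducing a societal disparity, whatever its internal logic.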

This work is innovative because it has been carried out by specialists in different areas, including a sociologist, an anthropologist and experts in gender and statistics. "The team's members provided a perspective that went beyond the autonomous mathematics associated with algorithms, thereby helping us to view them as complex socio-technical systems," said the study's principal investigator.

"If you compare this work with others, I think it is one of only a few that present the issue of biases in algorithms from a neutral standpoint, highlighting both social and technical aspects to identify why an algorithm might make a biased decision," she concluded.


This UOC research fosters the Sustainable Development Goals (SDGs): 5, Gender Equality; and 10, Reduced Inequalities.

Related article

CASTAÑEDA, J.; JOVER, A.; CALVET, L.; YANES, S.; JUAN, A. A.; SÁINZ, M. "Dealing with Gender Bias Issues in Data-Algorithmic Processes: A Social-Statistical Perspective". Algorithms (2022). https://doi.org/10.3390/a15090303


UOC R&I

The UOC's research and innovation (R&I) is helping overcome pressing challenges faced by global societies in the 21st century, by studying interactions between technology and human & social sciences with a specific focus on the network society, e-learning and e-health.

The UOC's research is conducted by over 500 researchers and 51 research groups distributed between the university's seven faculties, the E-learning Research programme, and two research centres: the Internet Interdisciplinary Institute (IN3) and the eHealth Center (eHC).

The University also cultivates online learning innovations at its eLearning Innovation Center (eLinC), as well as UOC community entrepreneurship and knowledge transfer via the Hubbik platform.

The United Nations' 2030 Agenda for Sustainable Development and open knowledge serve as strategic pillars for the UOC's teaching, research and innovation. More information: research.uoc.edu

UOC experts

Juliana Castañeda

UOC industrial doctoral student

Milagros Sáinz Ibáñez

Researcher at the Internet Interdisciplinary Institute (IN3)

Expert in: Gender stereotypes and roles; academic motivation and choice of studies in adolescence; gender and attitudes towards technology; careers in technology.

Knowledge area: Social psychology.

Sergi Yanes

GenTIC researcher
