5/14/24 · Communication

Digital ageism on dating apps: discrimination affects people aged 30 and over

UOC researchers study age biases on these platforms
A person checking their mobile phone

Photo: Andrej Lišakov/Unsplash

When they decide to use an online dating app, users expect to match with a partner they believe they have chosen themselves. However, this could not be further from the truth. It's not only the user who chooses the other person but also artificial intelligence (AI), and the problem is that "it doesn't always take into account what users want," said Juan Linares-Lanzman, a researcher with the Communication Networks and Social Change (CNSC) research group at the UOC's IN3. "AI reproduces and amplifies all kinds of stereotypes: racist and sexist ones, and also age-related ones. You only have to go on Google and search for 'age' and 'dating apps' to find questions such as 'Is 30 too old for dating apps?' and see that the ageist debate is alive online," he said.

The Universitat Oberta de Catalunya (UOC) researcher explained that both data collection and algorithmic processing can lead to age biases and other forms of discrimination. "This is when automated systems contribute to normalizing ageism," he said.

Indeed, Andrea Rosales, a researcher in the same research group at the UOC, noted that Tinder has been repeatedly accused of charging users different rates depending on their age. "Dating apps ask users for their age and for the age range they want to match with," she explained. Only some apps allow you to hide your age, and only if you're a premium user. "Age is therefore at the heart of the business of the algorithmic colonization of love, which restricts users' ability to explore relationships spontaneously or beyond personal prejudices," said Rosales, who is also a member of the UOC's Faculty of Information and Communication Sciences.

"The platform," said Rosales, "favours the visibility of users who, according to its predictions, are more popular, who are probably the youngest, the most attractive or those who make the most matches on a regular basis, rather than those who use the application with the same aims, such as forming a long-term relationship."

All these age biases are known as digital ageism. "Digital ageism is based on how age is represented and manifested in relation to digital technologies," said Linares-Lanzman. He was concerned that, in a hyperconnected world, this discrimination spreads everywhere and can become entrenched in both interpersonal and institutional relationships, warning that it particularly affects older and racialized women. "Ageist forms of exclusion lead, for example, to reduced interest in digital technologies and, ultimately, to undermining older people's self-esteem," he said.

 

At what age do people suffer from digital ageism?

Digital ageism negatively affects people who are considered older in a particular context. But, as Linares-Lanzman noted, in digital ageism it's not just chronological age that matters but also individual, social and cultural aspects. "It's based on a sociocultural construct," he said.

For example, on Tinder, most users are between 20 and 30 years old, and they are also the most likely to find people their age, make more matches and appear in more searches. From the age of 30, users are considered older.

Another example is the tech industry, which considers programmers to be older once they reach the age of 35. "In Silicon Valley, people talk about the difficulty of finding work for people over 35. The results of US polls are very worrying: three out of every four workers confirm the existence of ageism, and 80% of them are worried about how age may affect their careers," said Linares-Lanzman. The youthfulness of tech industry professionals also affects how digital technologies are conceived: they are usually aimed at a young audience, while older people's interests, skills and values are ignored.

 

Can we overcome digital ageism?

According to experts, the solution "is neither simple nor easy to imagine. […] They often want us to believe that this solution will come from the tech industry itself, because it's one of the main players involved, but what their formula is seeking is profitability. The more social problems there are, the more tech solutions they want to sell us," said Linares-Lanzman. He explained that this formula can already be seen in the use of AI for recruitment. "There's a dual discourse: on the one hand, the discourse of legislation and the ethics of inclusion, equity and diversity; and, on the other, that of the technological solutions that will end age biases in the workplace with AI, but without explaining how they'll do it or how effective they'll be in real terms."

The researcher insisted that the main problem, both with digital ageism and with any possible solutions, is how these companies work. "They're black boxes: we don't know anything about what data they're using or about their algorithms, and it's obvious that there's no real transparency or much willingness to provide it," he said. According to Linares-Lanzman, the recent AI law adopted by the European Union represents a breakthrough in this regard, but he made it clear that it's a case of "David versus Goliath".

Still, in his opinion, there are many things that can be done to combat ageism: "The first step is to be aware of ageist gestures in society, and speak out against ageism whenever it is identified."

 

European project

To understand how this discriminatory phenomenon works, Linares-Lanzman and Rosales from the UOC, together with a team of researchers from the University of Brighton (United Kingdom), Leiden University (Netherlands) and the Weizenbaum Institute (Germany), are preparing a study to critically assess how ageism works in systems, products, services and infrastructures where there is some form of AI. The project, Ageism in AI: new forms of age discrimination and exclusion in the era of algorithms and artificial intelligence (AGEAI), seeks to explore relevant areas such as healthcare, employment and recruitment systems, mobility and transport, financial services and the cultural industry.

This UOC research project supports Sustainable Development Goal (SDG) 10, on reducing inequalities.

Related article

Rosales, Andrea; Linares-Lanzman, Juan. "Sí, las 'apps' de citas discriminan a los usuarios mayores" [Yes, dating apps discriminate against older users]. COMeIN [online], April 2024, no. 142. ISSN: 1696-3296. DOI: https://doi.org/10.7238/c.n142.2425
