Paulina von Stackelberg and Myrthe Blösser, researchers at the UvA Amsterdam Business School, are very interested in data and bias, or data and gender bias to be precise. They routinely experience first-hand where that can lead.

Von Stackelberg often gets adverts for plastic surgery via social media, while Blösser is bombarded with adverts for diet pills. These are subjects that simply don’t interest them, yet they keep showing up on their timelines. And when Blösser recently asked Google to describe the concept of ‘masculine’, it responded with ‘courageous and adventurous’. When she then asked for a description of ‘feminine’, the algorithm came back with ‘sensual and attractive’.

Myrthe Blösser

Taking responsibility

Von Stackelberg and Blösser are worried about this development, which is why they’ve set up FemData, a series of events where they discuss the subject with guests. On 12 June they will take the stage themselves: they’ve been invited to talk about data and gender bias at SPUI25, a platform for academic and cultural debate. The researchers want to use the event to make clear that data is not always objective and that we need to be aware of this. ‘It’s not only important for researchers but also for other groups, like the people who build models,’ Blösser says. ‘We should engage in conversation about the responsibility you have when you use data. People often fail to acknowledge that data, too, can be biased and, even if they do see this, they often don’t know how to deal with it. We want to start a dialogue with as many people from as many disciplines as possible and thus focus attention on the issue. As long as there’s bias ‒ and unfortunately it will always be with us ‒ we would be well-advised to take account of the fact that it will end up in our data.’

Von Stackelberg goes on to say: ‘This can also be a problem in high-stakes settings such as medicine. For instance, research on a set of medical prediction models found that women ended up with higher false negative rates across classifiers, which can lead to worse treatment outcomes.’ Blösser adds: ‘Sometimes it’s clear to all of us, like the recommendations you get from Netflix. Another example is when I’m looking for images to add to a piece of research, let’s say of a nurse, and can only choose from images of women whereas, if I need a data scientist, I can only choose from images of men. But quite often, the prejudice in data is more subtle. That’s why it’s so important to talk about it and start a dialogue.’
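
To make the notion of a group-wise false negative rate concrete, here is a minimal, purely illustrative Python sketch. All of the data in it is made up; it does not reproduce the models or the study Von Stackelberg refers to, it only shows how such a rate is computed per group (the share of actual positive cases a classifier misses).

    import numpy as np

    # Illustrative only: made-up labels and predictions for a binary classifier,
    # split by a hypothetical gender attribute. 1 = condition present / flagged.
    y_true = np.array([1, 1, 0, 1, 0, 1, 1, 0, 1, 0])
    y_pred = np.array([1, 0, 0, 0, 0, 1, 1, 0, 0, 0])
    group  = np.array(["F", "F", "F", "F", "F", "M", "M", "M", "M", "M"])

    def false_negative_rate(y_true, y_pred):
        # FNR = missed positive cases / all actual positive cases
        positives = y_true == 1
        return ((y_pred == 0) & positives).sum() / positives.sum()

    for g in np.unique(group):
        mask = group == g
        print(g, round(float(false_negative_rate(y_true[mask], y_pred[mask])), 2))
    # Prints F 0.67 and M 0.33: in this toy data the classifier misses two of the
    # three actual cases among the 'F' rows but only one of three among the 'M' rows.

A gap like this means one group’s real cases go undetected more often, which is exactly the kind of disparity that can translate into worse treatment outcomes.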

Paulina von Stackelberg

Alternative solutions

There’s no single ready-made solution to the problem, but there are some promising developments. Von Stackelberg explains: ‘It’s essential to address the problem at different levels and to share knowledge. For example, everybody could be given access to software that can detect skewed data, so that bias can be detected and mitigated on a technical level not only by people with extensive statistical knowledge but also by people who work with data without being specialised in the modelling side. Awareness is important as well. We should learn about the risks of biased data from an early age; in my opinion, that is a responsibility of the educational system. You should teach people how biases arise and how they affect our daily lives. Bias has a wide range of origins, many of them going beyond what you immediately see. That’s why I think we need multiple layers of solutions. First of all, we need technical solutions. Secondly, we need to consider how bias arises, among other things by seeing it in a social-science and historical context. Lastly, it is important to incorporate a more policy-oriented perspective as well.’ Blösser chimes in: ‘It’s never too soon to begin raising awareness. Whatever your views on gender, I think we all agree that discrimination on the basis of somebody’s gender is a bad thing.’ Von Stackelberg continues: ‘As I see it, a dialogue about data can begin much earlier. Even at a young age, you can learn to look at information with a critical eye. It’s a good thing to pay attention to AI literacy and to teach children what an algorithm is and how it’s created.’

Handling bias with care

What would also help is a greater variety of people who work on models. ‘If you have personal experience with prejudice, there’s a good chance you’ll be more alert to it when you’re building a model,’ Von Stackelberg says. ‘But I also see that this is a sector with high-pressure work and little time for change. It’s not as if the onboarding of a few female data analysts will suddenly unearth all the biases.’ Blösser agrees: ‘Even with a diverse team, there’s no guarantee you’ll take more care in tackling bias. In the companies we talked to, there’s simply not enough time for the staff to delve into the issue more deeply. It’s just not a priority. But this has to change. After all, we’re in it together when it comes to determining what our society should look like and taking responsibility.’

Register for the SPUI25 event