Amsterdam Law School
8 May 2023
Social media algorithms are designed to keep you on a page for as long as possible; time is money. Leerssen examined the regulation surrounding this software. 'Recommendation systems play an important role in the media landscape: they largely determine what becomes popular. That is why the public interest in regulating them is enormous,' he says.
Leerssen took a critical look at that regulation. 'I see many new laws and regulations emerging to promote transparency. But lawyers also know that ensuring transparency is easier said than done.'
Fixating on algorithms is not the solution, according to Leerssen. 'There is now too much emphasis on the transparency of the algorithms behind recommendation systems. The problem is that you cannot understand how a system works if you only look at the technology; you also need to understand how people use that technology. When analysing a traffic accident, you look at the technical specifications of the car, but also at the traffic conditions and whether the driver had been drinking. It is therefore important to look closely at the social context in which a technology is used.'
There is nothing in Facebook's algorithm programmed to reward disinformation or racism
To get a clear picture of the impact of recommendation systems, Leerssen believes you have to look beyond transparency and ask the right questions. 'The machine-learning algorithms of social media are very complex, so it is often more effective to look at outcomes. By asking simpler questions, you also get answers that are useful to many people.'
'For example, there are tools that let parents see what their children are doing on social media. That way you don't have to get to the bottom of the algorithm, but you can see its outcomes.' You could also investigate the backgrounds of people who end up in a right-wing extremist rabbit hole. Such questions are difficult to answer, but not as complicated as fully explaining a machine-learning algorithm's choices. 'It is important to start the discussion with these questions.'
'About Facebook, for example, it is said that the algorithm is racist and spreads disinformation. But that is not only in the algorithm; it is also in the preferences of users. There is nothing in the algorithm programmed to reward disinformation or racism. You can only explain that effect by looking at user behaviour, which the algorithm responds to.' Then again, it is not just down to the users either, he notes: 'It is a complex interaction between users and technology. That is what we as a society must gain insight into.'
A clearer view of recommendation systems is desperately needed, according to Leerssen. He worries about the often invisible grip that social media platforms have on the media landscape. 'The discussion used to be much more about removing online content. That is a visible choice you can take issue with. But nowadays more subtle methods are being used to make content less findable in recommendation systems. Is content from LGBTQ+ YouTubers, for example, less visible? As a user, it is hard to tell whether that is a deliberate intervention by the platform or an unexpected quirk of the algorithmic system. The power of the platform becomes less visible.'
Ensuring more transparency is therefore still important; it just should not be the sole focus of new regulation, according to Leerssen. The government must provide a stronger counterweight to the platforms. 'Transparency is a means towards good regulation and oversight, not a substitute for them. In my dissertation I try to add something to that discussion: where are the opportunities, and where are the dead ends?'