7 October 2020
The team is examining the likely implications of data analytics and machine learning for the role of the media as it shifts from traditional mass-media modes of distribution towards more personalised news and platforms. With funding from the European Research Council and the SIDN fonds, they are working to pop filter bubbles and harness the power of AI to create better and more successful recommendations, by developing tools that measure diversity in recommendations based on principles of democratic theory.
Team leader Natali Helberger: ‘One of the key findings of this project is that we need to think smarter about algorithms and AI in the media, and how to use them in a way that furthers the public mission of the media to inform users in these times of information overload and misinformation, and helps them to consume a diverse media diet.’
New powers and new responsibilities
In today’s algorithm-driven world, news organisations have the power to actively guide and shape people’s news exposure. This power brings with it new responsibilities and raises new and fundamental questions about the role of news recommenders in accomplishing the media’s democratic mission. How diverse should recommendations be? How personally relevant and inclusive? How far should the media go in engaging with the audience, and what is the role of other values, such as participation, transparency, deliberation and privacy? What are the longer-term societal implications of personalised information exposure? And more generally, what are the objectives and values that recommendations should be optimised for?
‘Individual interests such as privacy, autonomy and accuracy must be balanced against the opportunities that data and AI offer for better informing and even educating citizens. Algorithmic news recommendations in themselves are neither good nor bad for democracy. It is the way the media use the technology that creates threats or opportunities.’ – Natali Helberger
The team’s research shows that, so far, news recommendation algorithms have often been optimised for rather simplistic short-term metrics, such as clicks, time spent and past user behaviour. But, the researchers argue, if more thought were given to how recommenders can inform people better and offer them more diverse choices, while at the same time respecting their privacy and autonomy, news recommenders could become extremely powerful and useful tools for the news media. The research also shows that, unfortunately, despite some good initiatives, there is still too often a disconnect between the R&D departments where the technology is developed and the editors and journalists who have traditionally selected what news to show the reader.
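To make the contrast concrete, the sketch below sets a ranking optimised purely for predicted clicks against a simple greedy re-ranking that also rewards topical diversity. This is not the team’s actual system: the articles, the click probabilities and the 0.3 diversity weight are invented for illustration only.

```python
# Illustrative sketch (hypothetical data): click-only ranking vs. a greedy
# re-ranking that trades predicted clicks against gains in topical diversity.
from collections import Counter
from math import log2

def rank_by_clicks(articles):
    """Baseline: order articles by predicted click probability alone."""
    return sorted(articles, key=lambda a: a["p_click"], reverse=True)

def topic_entropy(articles):
    """Shannon entropy of the topic distribution in a list of articles."""
    counts = Counter(a["topic"] for a in articles)
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

def rerank_with_diversity(articles, k=5, weight=0.3):
    """Greedy re-ranking: each step picks the article that best trades off
    predicted clicks against the gain in topical diversity of the slate."""
    slate, pool = [], list(articles)
    while pool and len(slate) < k:
        def score(a):
            gain = topic_entropy(slate + [a]) - (topic_entropy(slate) if slate else 0.0)
            return (1 - weight) * a["p_click"] + weight * gain
        best = max(pool, key=score)
        slate.append(best)
        pool.remove(best)
    return slate

articles = [
    {"id": 1, "topic": "politics", "p_click": 0.9},
    {"id": 2, "topic": "politics", "p_click": 0.8},
    {"id": 3, "topic": "climate",  "p_click": 0.6},
    {"id": 4, "topic": "culture",  "p_click": 0.5},
    {"id": 5, "topic": "politics", "p_click": 0.7},
]
print([a["id"] for a in rank_by_clicks(articles)[:3]])        # click-only slate
print([a["id"] for a in rerank_with_diversity(articles, 3)])  # diversity-aware slate
```

In this toy example the click-only slate is dominated by a single topic, while the diversity-aware slate gives up a little predicted engagement in exchange for a broader mix; the weight parameter is exactly the kind of editorial choice the researchers argue should not be left to the R&D department alone.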
Smarter recommendation metrics
The UvA group’s insights in this area have brought them together with media companies in the Netherlands, Belgium, Germany and the UK, with whom they are working to develop smarter recommendation metrics. Their approach to diversity in recommendations has also been embraced by an international group of experts from computer science, law and communication science in the so-called Dagstuhl Manifesto. Furthermore, based on their work, they were asked by the Dutch Media Regulator to advise on its strategy for dealing with algorithmic recommendations on social media platforms, and by the Canadian government, in cooperation with UNESCO, to advise on how to realise diversity in recommendations.
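As a rough illustration of what a ‘smarter’ metric could look like, the hypothetical sketch below measures how far the topical mix a user was actually shown diverges from a normative target mix, rather than counting clicks. The target shares, category names and articles are invented for the example and are not the metrics the team or its partners actually use.

```python
# Hypothetical sketch: compare the topical mix of shown recommendations against
# a normative target mix (e.g. one informed by editorial or democratic-theory
# ideals). Lower divergence means the exposure is closer to the target diet.
from math import log2

def kl_divergence(p, q):
    """KL divergence D(p || q) between two discrete distributions over the
    same categories."""
    return sum(p[c] * log2(p[c] / q[c]) for c in p if p[c] > 0)

def exposure_distribution(recommended, categories):
    """Share of each category in the list of recommended articles."""
    total = len(recommended)
    return {c: sum(1 for a in recommended if a["topic"] == c) / total for c in categories}

# Invented normative target: one possible notion of a 'diverse news diet'.
target = {"politics": 0.4, "economy": 0.2, "climate": 0.2, "culture": 0.2}

shown = [
    {"id": 1, "topic": "politics"},
    {"id": 2, "topic": "politics"},
    {"id": 3, "topic": "politics"},
    {"id": 4, "topic": "culture"},
]
p = exposure_distribution(shown, target.keys())
print(f"divergence from target diet: {kl_divergence(p, target):.2f}")
```

A metric of this kind can be tracked over time per user or per section, which is one way a newsroom and an R&D team could share a common, diversity-oriented yardstick instead of clicks alone.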
Positive and negative effects on freedom of expression
Another important insight of the team’s research concerns the legal and fundamental rights implications of news recommendation algorithms. Such algorithms can affect users’ right to freedom of expression in both positive and negative ways. Negatively, to the extent that news recommendation and personalisation strategies can create new digital vulnerabilities, such as users who rely on social media alone and see only personalised news, or who are subject to manipulative targeting strategies (including political microtargeting). On the other hand, news recommendations can help people find more relevant news, and the team’s empirical research has confirmed that users actually appreciate automated news recommendations - even more than those made by people - provided they respect users’ privacy and their interest in consuming diverse news.
Based on these insights, the team were asked to advise the Council of Europe, as well as national regulators and ministries in the Netherlands, Germany and the UK, on strategies for building the kind of algorithms that do just that.