Talking to the perpetrator of a crime, or to a deceased loved one, can help people process grief or trauma. Using deepfake technology, you can now do that with a virtual avatar. This is safer than a real encounter, and users perceive it as lifelike.

How does a mouth move when you pronounce the ‘o’? Or the ‘a’? Translating sounds into mouth movements is one of the techniques Theo Gevers and colleagues taught a computer. As a result, a virtual avatar makes exactly the right mouth movements when a certain sound is heard, making it appear realistic. This is so-called deep faking technology, made famous by funny movies, but also by actresses whose faces have been unwillingly used in porn movies, and by politicians. You can let them say things they have never actually said. But the same technology can also be used in mental health care, for trauma therapy and grief counselling.
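The idea of translating sounds into mouth movements can be illustrated with a minimal sketch: each speech sound (phoneme) is mapped to a mouth shape (viseme), and a timed sequence of phonemes becomes a timed sequence of mouth poses for an avatar to play back. The phoneme codes and viseme labels below are illustrative assumptions, not the mapping the researchers actually use.

```python
# Illustrative phoneme-to-viseme lookup. Real systems use learned models
# and many more mouth shapes; these four entries are placeholders.
PHONEME_TO_VISEME = {
    "AA": "open-wide",    # as in 'a'
    "OW": "rounded",      # as in 'o'
    "M":  "closed",       # lips pressed together
    "F":  "teeth-on-lip",
}

def mouth_track(timed_phonemes):
    """Convert (phoneme, start_seconds) pairs into (viseme, start_seconds) pairs."""
    return [(PHONEME_TO_VISEME.get(p, "neutral"), t) for p, t in timed_phonemes]

print(mouth_track([("HH", 0.0), ("AA", 0.1), ("M", 0.3)]))
# → [('neutral', 0.0), ('open-wide', 0.1), ('closed', 0.3)]
```

In a real pipeline the timings come from speech recognition on the audio track, and the visemes drive the avatar's facial animation frame by frame.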

Theo Gevers. Photographer: Dirk Gillissen

Deep learning

Gevers is a professor of computer vision at the University of Amsterdam and leader of the Computer Vision (CV) research group. He has worked on image recognition for his entire career. Until a decade ago, recognising what is in a picture involved a great deal of manual work, until a new technique broke through: deep learning. Algorithms could now recognise images faster and independently. In the last few years, another development has been added: image generation, as offered by programmes such as DALL-E and Midjourney. ‘We are now in a very hectic, dynamic and interesting period, because this generative form of AI is suitable for various application areas, such as climate prediction, the development of new drugs and materials, and healthcare. I think AI can mean an awful lot to society,’ concludes Gevers.

Back in 1997, long before deepfakes of celebrities became widespread, Gevers and his team developed technology to generate and detect deepfakes. But they also explored applications of deepfakes, such as trauma and grief processing. In collaboration with trauma experts, the first results were published and proved promising, not only for trauma therapy but also for grief processing. ‘For some people, grief can be very problematic, for example when someone dies suddenly or in a violent way,’ he says. ‘Deepfake therapy as part of grief treatment can help the bereaved.’


Video conversation with avatar

Gevers contacted psychological researchers and mental health therapists. Since then, several studies have begun using deepfake therapy for depression, trauma, grief and PTSD. A deepfake session goes as follows: the client sits down behind a computer screen and starts a video conversation with his or her therapist. The therapist first appears on screen and explains what is going to happen, then switches to another screen where the deepfake avatar, with the face of the possible perpetrator, comes into view. The client can now talk to the avatar. The therapist answers but is not visible; his or her own voice can be used, but it is also possible to use the offender's voice. Seeing the possible offender makes the biggest impression, says Gevers, but with the voice added, the impact is even greater. The avatar can follow the therapist's real movements, but sometimes a ‘loop’ is made in which 20 seconds of footage is repeated over and over, something Gevers says is hardly noticeable.

In conversation with offender

The first results of the experiments are promising: clients report that the therapy helps them cope with trauma and grief. The method is now increasingly used to confront victims with perpetrators in a safe way. Gevers works with psychologists, ethicists and trauma and grief experts to develop guidelines that put the well-being of victims at the centre. Gevers: ‘In rape cases, victims often struggle with the question: was it my fault? They want to put that question to the perpetrator. But that is dangerous to do in real life; the perpetrator might hurt the victim even more.’ The therapists, who direct the avatars, work with scripts, and a lot of thought goes into preparing how best to respond to certain questions.

Startup

The technology is now ready to be deployed, which is happening at the startup 3DUniversum, of which Gevers is the co-founder. Therapists can register on the platform, and after an intake, they can upload pictures and audio files, and organise a deepfake session.

A next research step is to automate the scripts for the conversations before and after therapy. ‘Before therapy starts, there is always an intake, with the aim of getting a comprehensive and detailed picture of a client's psychological, medical, social and personal background. This helps the psychologist better understand the client's issues, make an accurate diagnosis and draw up a treatment plan. It takes a lot of time, and there are few therapists, so this could also be done by a therapeutic chatbot. For this, we want to train an AI model on lots of existing questions and answers, so that the model itself learns when to ask which question and when to ask follow-up questions. It can then pick up signals that a client is facing a possible problem, and if so, the real therapist can be called in.’ Will AI also be able to provide full deepfake therapy in the future? Gevers is not so sure. ‘For simple therapies maybe, but grief and trauma therapy require specific expertise and a human approach,’ he says.

‘Mental health problems are less easy to solve because they are not so easily measurable. But it is certainly possible to heal wounds caused by trauma using AI.’