It is a tradition in the UK for the Queen to address the nation on Christmas Day, via a message broadcast on the BBC. Another tradition is for Channel 4 (a commercial TV channel) to broadcast its “alternative Christmas message” at the same time. The message usually flags a contemporary social issue, such as mass surveillance (the alternative Christmas message of 2013) or the Grenfell Tower fire (the alternative Christmas message of 2017). This year, Channel 4 decided to use this opportunity to focus on deepfakes.
The term “deepfake” refers to the use of deep machine learning techniques to manipulate image and/or audio content in order to produce an output (e.g., a video) which is different from the original (i.e., a fake). These fake outputs are so seamless that they have the potential to deceive.
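Real deepfakes are produced by deep encoder-decoder networks trained on footage of both faces, which is well beyond the scope of this post. But the underlying idea – replacing one region of an image with content taken from another – can be illustrated with a toy sketch. The plain-Python function below (entirely hypothetical, with no machine learning involved) shows only that naive region-replacement-and-blending step, not an actual deepfake:

```python
def naive_face_swap(target, patch, top, left, alpha=0.8):
    """Blend a 'face' patch into a target image at (top, left).

    NOTE: this is NOT a deepfake. Real face swaps are generated by deep
    neural networks; this toy function only illustrates the basic idea
    of replacing one image region with other content.
    """
    out = [row[:] for row in target]  # copy the target image
    for i, patch_row in enumerate(patch):
        for j, p in enumerate(patch_row):
            t = out[top + i][left + j]
            # Alpha-blend the patch pixel over the target pixel.
            out[top + i][left + j] = round(alpha * p + (1 - alpha) * t)
    return out

# Toy 8x8 grayscale "image" (all black) and a bright 4x4 "face" patch.
target = [[0] * 8 for _ in range(8)]
face_patch = [[200] * 4 for _ in range(4)]
swapped = naive_face_swap(target, face_patch, top=2, left=2)
```

A genuine deepfake replaces this copy-and-blend step with content *synthesised* by a network, which is what makes the output so seamless.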
Channel 4’s alternative Christmas message for 2020 was a ‘face swapping’ deepfake, in which the face of the UK’s monarch was superimposed onto the body of an actor, who also mimicked the Queen’s voice and mannerisms. Here is the output:
And here is the “making of”, showing how the production team put it all together:
Predictably, Channel 4’s initiative attracted substantial criticism online because of the subject of the video (i.e., the Queen). What I found quite surprising, however, was the reaction of some experts who claimed that the initiative was counter-productive because this technology is not yet widely used for misinformation. In my view, it’s a bit like saying that we shouldn’t draw attention to possible foreign interference in US elections because it has only happened a couple of times.
To be clear, deepfakes create interesting opportunities for individuals as well as organisations. However, like other technologies, they also have a downside. Kietzmann and colleagues discuss a number of examples of what they term the bright and the dark sides of deepfakes, in the paper mentioned above:
**Individuals**

*Bright side:*

- Entertainment – “We may soon enjoy injecting ourselves into Hollywood movies and becoming the hero(ine) in the games we play on our phones or game consoles.”
- Convenience – “Instead of going to the store, we might ‘deepfake ourselves’ by sharing our photos – and eventually, our personal decoders – in order to create virtual mannequins that model different outfits on us”.

*Dark side:*

- Consent – “[People] did not consent to their portrayal in the deepfakes and might object to them strongly”.
- Reputation – “The harm this can do to us all becomes even clearer in the case of a then-18-year-old female – an ordinary, nonfamous citizen – who one day discovered hundreds of explicit deepfake images and videos with her face on the bodies of porn actresses. These deepfakes not only put her reputation at risk, but also her emotional well-being, her career prospects as an aspiring lawyer, and her physical safety.”
- Emotional harm – “With such a powerful technology and the increasing number of images and videos of all of us on social media, anyone can become a target for online harassment, defamation, revenge porn, identity theft, and bullying”.

**Organisations**

*Bright side:*

- Personalisation – see the virtual mannequin example, above.
- Cost – “(C)elebrities can simply make their personal deep network models available so that deepfake footage can be created without the need for travel to a video shoot, for example.”
- Fixing mistakes in audio-visual production – “(F)ace-swapping (aka face-leasing) and voice dubbing will be popular so that movie or advertising producers can fix misspoken lines or make script changes without rerecording footage and create seamless dubs of actors speaking different languages.”
- Ease of visual production – “(A)ctors can look older or younger with the use of deepfakes instead of time-consuming make-up.”

*Dark side:*

- Industry disruption – “Technological advancements often make incumbents redundant. For example, the entire dubbing and re-voicing industry, which has long translated movies so that the new words match the original lip movement of the actor, is endangered and at risk of becoming extinct now that languages and lips can be changed”.
- Fraud – “Many unsuspecting firms will fall victim to trickery.”
- Misinformation – “Videos that deliberately state false earnings estimates will hurt stock prices, and deepfake videos of CEOs in compromising situations will impact their firms’ reputation and put stakeholder agreements at risk.”
- Blackmail – “Opportunities for algorithmic blackmail, in which managers are offered a choice to either pay a fee to stop a deepfake from being shared or suffer the very public consequences.”

**Governments**

*Bright side:*

- Effective communication – “Ability to communicate with various stakeholders in a way that is accessible to them. For instance, a public service announcement can be broadcast in a number of different languages.”

*Dark side:*

- Sabotage – “A government leader could be shown covering up a misdeed or making racist remarks just before an election or a major decision”.
- Undermining of democracy – “(It) will be irresistible for nation-states to use in disinformation campaigns to manipulate public opinion, deceive populations and undermine confidence in institutions”.
For all of these reasons, I think that Channel 4’s deepfake video was a very good initiative indeed. We desperately need to raise public awareness of, and education about, this technology and the broad range of opportunities and threats that it presents! My only regret is that this important message was released at a time when the public’s attention was completely taken up with other matters.
Was Channel 4’s “Deepfake Queen” video a good idea?