Last June, the FBI issued a public alert revealing a surge in "reports from victims, including minors and non-consenting adults, whose photos or videos were altered into explicit content." The same concern was echoed in a Europol report this March, which underscored the growing criminal use of deepfakes for sextortion: extorting money from victims by threatening to reveal "evidence" of their sexual activity. 'Deepfakes' are AI-generated videos so hyper-realistic that they are virtually indistinguishable from reality, and in recent years they have become the tool of choice for cybercriminals seeking to blackmail and exploit victims. Safety, privacy and consent have become key issues in the discourse about deepfakes – MIT Technology Review estimated that about 95% of deepfake videos are non-consensual pornography, and that around 90% of those videos target women, many of them underage. Indeed, revenge pornography has been found to account for as much as 96% of all deepfake videos (Durbin & Graham, 2024). This article explores the troubling lack of strict legislation surrounding deepfakes, which makes it challenging to hold perpetrators accountable, and how that legal vacuum exacerbates the psychological impact on victims of deepfake sextortion.
Laws are crucial here because they can govern the creation, distribution, and malicious use of deepfakes, providing a framework to combat this growing threat. Clare McGlynn, a law professor specialising in issues related to pornography and online abuse, emphasises the critical role of legislation in shaping societal norms: it not only deters individuals from creating deepfakes but also sends a clear message about the wrongfulness and harm of such activities. However, deepfake technology is evolving faster than legal systems can keep up. In the US, no federal legislation currently exists to address the threats posed by deepfakes (Princeton Legal Journal, 2023). In the EU, too, there is no law specifically regulating deepfakes (van der Sloot & Wagensveld, 2022).
Last year, NBC reported what happened to internet celebrity QTCinderella, who was the subject of hundreds of deepfake pornography videos. She was advised that legal action against the person(s) who had stolen her likeness would not succeed, due to the lack of existing legislation. This is not only true for celebrities – only a few months ago, GQ reported the case of Owen, a victim of deepfake sextortion: "All they [the police] told me to do was ignore the scammers, even if they harass me non-stop. I can't do anything about it." The hopelessness Owen experienced at the time is widespread among victims of deepfakes, who lack legal protection. The negative consequences of sextortion are exacerbated in the case of deepfakes, because the police are often not actively trying to find the perpetrator (Leukfeldt, Notté, & Malsch, 2019).
This does not mean that a victim has zero recourse: they can potentially bring defamation or harassment lawsuits. But deepfakes are typically posted and consumed by anonymous users, making redress difficult, if not impossible. Moreover, monetary compensation would not undo the damage to mental well-being and reputation. In the EU, the General Data Protection Regulation (GDPR) is generally strict about privacy, but when deepfakes merge the images or voices of two or more persons, it is unclear whether the final result is also subject to the law (van der Sloot & Wagensveld, 2022). Even where citizens have rights to block or remove unlawful content, in practice it is hard to get platforms to remove content, let alone to prevent copies from being shared on other parts of the internet. Legal proceedings often cost money and time that the average person does not have, and may even attract more attention to the content they want removed.
Clearly, stricter and more specific laws are needed. Why is that such a challenge? The Dutch Ministry of Justice and Security recently commissioned a report on the legal challenges of deepfakes, which identified several difficulties in developing and enforcing laws against deepfake sextortion. First, technology develops rapidly, so technology-specific rules quickly become outdated; a paper by Yönt (2024) on the 'deepfake menace' adds that many judges, prosecutors and lawyers struggle to understand technological crimes. Second, because of the cross-border nature of the technology, parties are often subject to multiple legal regimes and tend to locate in the jurisdiction with the lowest regulatory burden – as Yönt (2024) puts it, laws are made nationally, while deepfakes are borderless. Third, a complex web of parties is involved in the production and distribution of deepfakes, and it is not always clear who can be held accountable; the anonymity of many malicious actors makes them all the more difficult to trace (Yönt, 2024).
Given this legal situation, one can no doubt imagine the overwhelming helplessness that a victim would experience. Leukfeldt et al. (2019) found that when threats are carried out online, their consequences are aggravated. This is because the online space is more omnipresent, and victims feel that nowhere is safe and that their only option is to withdraw from society altogether. When such crimes go unpunished, victims continue to live in fear, vulnerability and distrust (Altholz, 2020; Morris & Scott, 2022). In the case GQ reported, Owen said that he "was lost for words" and "wanted to bury [himself] alive." For months, he was scared to leave his house, fearing that people might have seen the video and would recognise him. Another victim, whose case The Times of India reported last November, was left "mentally disturbed" and was considering suicide. Many such victims experience trauma guilt and emotional dysregulation, both of which mediate PTSD and depression levels (Holladay, 2016).
Even the family members of victims are at increased risk of depression, anxiety, and PTSD due to the lack of closure and resolution (Altholz, 2020; Connolly & Gordon, 2015; Mastrocinque et al., 2015). Without legal protection, a victim's fear and uncertainty turn into strong feelings of anger, frustration, general distrust, and hopelessness (Altholz, 2020; Connolly & Gordon, 2015; Reed & Caraballo, 2022). Victims also find themselves coping with feelings of blame, resentment, and guilt – all of which put additional strain on their relationships (Armour, 2002; Connolly & Gordon, 2015).
Lives, careers, and relationships can be ruined with nothing more than a decently powerful computer. If left unchecked, deepfakes will keep enabling cybercrimes with life-wrecking consequences that are, unfortunately, all too real.
References
- Altholz, R. (2020). Living with impunity: Unsolved murders in Oakland and the human rights impact on victims' family members. Social Science Research Network. https://doi.org/10.2139/ssrn.3618573
- Connolly, J., & Gordon, R. (2014). Co-victims of homicide. Trauma, Violence & Abuse, 16(4), 494–505. https://doi.org/10.1177/1524838014557285
- Europol. (n.d.). Facing reality? Law enforcement and the challenge of deepfakes. https://www.europol.europa.eu/publications-events/publications/facing-reality-law-enforcement-and-challenge-of-deepfakes
- Hao, K. (2021, February 16). Deepfake porn is ruining women's lives. Now the law may finally ban it. MIT Technology Review. https://www.technologyreview.com/2021/02/12/1018222/deepfake-revenge-porn-coming-ban/
- Harris, K. R. (2021). Video on demand: What deepfakes do and how they harm. Synthese, 199(5–6), 13373–13391. https://doi.org/10.1007/s11229-021-03379-y
- Holladay, K. R. (2016). An investigation of the influence of cyber-sexual assault on the experience of emotional dysregulation, depression, post traumatic stress disorder, and trauma guilt. Electronic Theses and Dissertations. https://stars.library.ucf.edu/etd/5242/
- Kaate, I., Salminen, J., Santos, J. J., Jung, S., Olkkonen, R., & Jansen, B. J. (2023). The realness of fakes: Primary evidence of the effect of deepfake personas on user perceptions in a design task. International Journal of Human-Computer Studies, 178, 103096. https://doi.org/10.1016/j.ijhcs.2023.103096
- Lucas, K. T. (2022). Deepfakes and domestic violence: Perpetrating intimate partner abuse using video technology. Victims & Offenders, 17(5), 647–659. https://doi.org/10.1080/15564886.2022.2036656
- Malicious actors manipulating photos and videos to create explicit content and sextortion schemes. (n.d.). https://www.ic3.gov/Media/Y2023/PSA230605#fna
- Mastrocinque, J. M., Metzger, J., Madeira, J. L., Lang, K., Pruss, H., Navrátil, P., Sandys, M., & Cerulli, C. (2014). I'm still left here with the pain. Homicide Studies, 19(4), 326–349. https://doi.org/10.1177/1088767914537494
- Reed, M. D., & Caraballo, K. (2021). Voice of the victims: Accounts of secondary victimization with the court system among homicide co-victims. Journal of Interpersonal Violence, 37(13–14), NP10832–NP10861. https://doi.org/10.1177/0886260521989732
- Schofield, D. (2024, April 24). "I wanted to bury myself alive" – Inside the rise of male sextortion scams. British GQ. https://www.gq-magazine.co.uk/article/sextortion-scams-deepfake-porn-men
- The deepfake menace: Legal challenges in the age of AI. (2024, March 19). TRT World Research Centre. https://researchcentre.trtworld.com/discussion-papers/the-deepfake-menace-legal-challenges-in-the-age-of-ai/
- They appeared in deepfake porn videos without their consent. Few laws protect them. (2023, February 14). NBC News. https://www.nbcnews.com/tech/internet/deepfake-twitch-porn-atrioc-qtcinderella-maya-higa-pokimane-rcna69372
- van der Sloot, B., & Wagensveld, Y. (2022). Deepfakes: Regulatory challenges for the synthetic society. Computer Law & Security Review, 46, 105716. https://doi.org/10.1016/j.clsr.2022.105716
- WODC. (2021, December 29). Deepfakes. https://repository.wodc.nl/handle/20.500.12832/3134