Wrongness, Personhood, and Image-Based Sexual Abuse: Placing Technology-Facilitated Violence in the Realm of Justified Criminal Law


In an era in which digital technology is intertwined with every aspect of daily life, image-based sexual abuse and technology-facilitated sexual violence have emerged as significant legal and societal issues. Whilst statistics are scarce, available figures from 2017 suggest that 15% of adult women and 7% of adult men have experienced image-based sexual abuse. These numbers are likely to have risen since then, not least because of the development of deepfake technology: research from 2023 shows that 98% of deepfakes are pornographic, and that 99% of those pornographic deepfakes depict women. The laws of liberal democracies are increasingly being amended to address image-based sexual abuse and other forms of technology-facilitated violence. Yet these laws also expose gaps in how those legal systems recognise and treat online wrongs as genuine wrongs against persons, which can result in only moderate criminalisation of severely harmful and wrongful acts. To help build a fuller understanding of the wrongness of technology-facilitated violence, this research project explores the theoretical foundations of justified criminalisation against the backdrop of the digital age.
This project considers the theoretical foundations of the criminal law in light of potential justified criminalisation of technology-facilitated abuse. It offers an in-depth analysis of the wrongness of image-based sexual abuse, builds towards a theory of personhood for the digital age, and identifies technological developments that warrant legal attention. The research sits at the intersection of criminalisation theory, personhood theory, human rights law, and gender-based violence. Its outcome will assist legislators and academics in developing and analysing the justified criminalisation of technology-facilitated wrongs.


Expected outcome: monograph, scientific articles
Research focus: II. Regulating Intimate Relations
Project languages: English, Dutch, German


Key publications

Goudsmit Samaritter, M. (2024). Deepfake pornografie: neppe foto’s, echte afbeeldingen - analyse en aanbevelingen over de strafbaarstelling van seksueel misbruik met beeldmateriaal in art. 139h Sr. Ars Aequi: juridisch maandblad, 73(06), 497–506.
Goudsmit Samaritter, M., Aksay, R., & Oerlemans, J.-J. (2023). Strafbaarstelling van seksuele deepfakes. Boom Strafblad, (5), 239–247. doi:10.5553/BSb/266669012023004005007

Academic activities

  • Workshop: Online sexual violence, Slachtofferhulp, 09/01/2023.
  • Workshop discussant: Interdisciplinary Human Rights Graduate Workshop, University of Oxford, 11/28/2023.
  • Author roundtable: Legal Interests and Public Goods in Criminal Law, Leiden University, 05/29/2024.
  • Conference: Law and/versus Technology: Trends for the new decade, Leiden University, 06/20–21/2024.

Academic outreach / public engagement

  • Op-ed: Deepfake-porno is ernstig seksueel misbruik, maar dat beseft niet iedereen, Trouw, 03/30/2024.
  • Radio interview: John Pienaar with Times Radio Drive, Times Radio, 04/16/2024.
  • Press article: Jildou Beiboer and Jennifer Mol, Deepfakeporno afrekenen? Kan gewoon via reguliere betaaldiensten, Financieel Dagblad, 06/10/2024.
