The Use of Artificial Intelligence as a Form of Sexual Violence


The events surrounding the release of AI-generated pornographic content depicting internet personality Imane Anys (better known by her alias, Pokimane) at the beginning of the year brought to the forefront of social media a discussion that few in the world of technology have been willing to entertain. Anys is not the only one to have been victimized: numerous female streamers have had their likenesses stolen for use in deepfake pornography. While those in the know have been warning the public about the many dangerous uses of neural-network image generation for several years, the pace at which artificial intelligence has developed in the past five years has been staggering, and the emergence of products such as ChatGPT has illuminated to those outside the bubble of Silicon Valley what AI is already capable of.

This collective ignorance has unfortunately left many vulnerable to an existing form of exploitation they were previously unaware of. The issue of deepfake pornography (altered media that swaps one face for another) has only worsened as the technology to create deepfakes has become more accessible. A 2019 study by Deeptrace Labs found that, between December 2018 and October 2019, the amount of deepfake content available on streaming sites and pornography platforms increased by 100%, and, most disturbingly, that 96% of deepfake content was pornographic, typically using the likenesses of female celebrities. These statistics come from a time when the practice of making deepfakes was still relatively rare; in the years since, both the demand for and the supply of AI-generated pornography have soared, and by 2022 the number of deepfake pornographic videos had grown to more than six times the number identified in the 2019 report. The problem has run rampant with little regulation or legal input, likely due to the exponential rate at which the technology has developed.

It should come as no surprise that, when given tools with the potential to exploit women further, a patriarchal society will do just that. Moreover, these tools have become ever more accessible, with websites peddling the promise of 'virtually undressing' women, like the one reported on by the Huffington Post in 2021, being created in droves. Emerging reports of the use of generative AI to create child sexual abuse material in the UK have only highlighted the myriad risks this technology presents to the safety of women and girls when in the hands of opportunistic abusers. This material is being created and distributed faster than it can be detected and legislated against, and the open-source nature of the software used (such as the Stable Diffusion model developed by Stability AI, mentioned in the BBC article) allows new websites and content creators to easily replace those that have been successfully taken down. The epidemic of user-generated 'revenge pornography' is, unfortunately, only in its infancy.

Revenge pornography is a crime across the United Kingdom. In England and Wales, the Criminal Justice and Courts Act 2015 defines the offence as the disclosure of 'a private sexual photograph or film if the disclosure is made without the consent of an individual who appears in the photograph or film, and with the intention of causing that individual distress', with similar definitions used in Scotland and Northern Ireland. The same is true in the United States, where forty-eight of fifty states have explicitly criminalized revenge pornography. Theoretically, the mechanisms (however inelegant and unrefined they may be) are in place for victims to seek justice, in parts of the Western world at least.

However, in cases such as Anys's, one may argue that 'revenge pornography' is a misnomer. Little of the element of 'revenge' exists in these cases: no party felt they had been wronged and elected to create those materials to distress her as an act of retribution. While the intent to distress may have been an element in creating and distributing that media, it was not the sole or even primary motivator. The other factors at play (the pursuit of sexual gratification, the desire to objectify an unwilling subject, entitlement to the female body, and so on) arguably have a much larger role. Certain cases may undoubtedly be fuelled by a desire to distress the victim over an interpersonal issue, but to claim that all non-consensual AI-generated material is 'revenge pornography' reduces a complex issue to a single, often unrelated cause. The label of 'pornography' also implies some sense of participation from the subject of the materials, as though it were a self-recorded tape uploaded to a seedy website and not a fictionalized representation of them. Already in the US, cases such as that of 'Lauren', detailed in an article from last June, lay out the shortcomings of treating deepfake pornography as a subsidiary of 'revenge pornography', as these incidents often fail to meet the legal definition of the offence. In addition, the Online Safety Act 2023 in the UK seeks to explicitly criminalize the production and distribution of deepfake pornography to prevent such situations, but the issue of terminology remains central to ensuring victims can pursue justice globally. 'Revenge pornography' is already an archaic label, one dripping with contempt for the survivor; now, it risks alienating an entirely new group of victims from being able to properly seek justice.

However, the situation is evolving rapidly, and solutions such as legal regulation and amended terminology may not be enough to shove what the tech industry has released back into Pandora's box. The trauma inflicted upon victims of AI-enabled sexual image abuse has already occurred, and the heads of this hydra only grow back faster once cut off.

Furthermore, some international organizations have adopted a 'head-in-the-sand' approach to the issue, as though ignoring the possibility of such uses of AI will somehow make them less widespread. The EU's landmark AI Act notably does not mention sexual image creation or any form of sexual abuse in its regulations on the use of AI within member states, despite a prominent instance of AI being used to create child sexual abuse images in Spain. Ignoring the problem will not make it go away; much like the heads of the aforementioned hydra, new problems will appear that would never have emerged had the initial one been addressed. Had Spanish schoolchildren not been able to freely create and distribute pornographic images of their classmates, there would be less of the artificially generated child sexual abuse material that independent watchdogs such as the Internet Watch Foundation have warned of. While this material is already illegal under laws criminalizing the possession of other materials depicting the abuse of fictional children (such as drawings), investigators are now faced with the task of determining which depicted children are real and which have been created by AI; yet another issue born of an initial lack of preventative action.

Though it may sound defeatist to say, global legal and legislative infrastructures have already failed. A lack of preparation for, or even awareness of, the possibility that a new form of sexual violence could materialize out of technological developments has resulted in universal cluelessness about how to solve this problem. If the law has to react to something, it has already failed to prevent it and to protect the population. Now, the countdown to rectify these mistakes has begun. The regulation and restriction of these technologies must proceed with the safety of women and girls in mind, as legislatures and courts attempt to play catch-up and prevent further damage from being done.


References:

Ajder, H., Patrini, G., Cavalli, F. and Cullen, L. (2019). The State of Deepfakes. [online] Amsterdam: Deeptrace. Available at: https://regmedia.co.uk/2019/10/08/deepfake_report.pdf

Brandt, O. (2023). 'Streamer apologises for deepfake porn scandal'. News.com.au. [online] Available at: https://www.news.com.au/technology/gaming/streamer-apologises-for-deepfake-porn-scandal/news-story/902d8b4e50f8de61cc4247ff85efe9bc

Crawford, A. and Smith, T. (2023). 'Illegal trade in AI child sex abuse images exposed'. BBC News. [online] Available at: https://www.bbc.com/news/uk-65932372

McGlynn, C. (2023). 'Intimate image abuse: Terminology'. [online] Available at: https://www.claremcglynn.com/intimate-image-abuse

Milmo, D. (2023a). 'Age checks, trolls and deepfakes: what's in the online safety bill?'. The Guardian. [online] Available at: https://www.theguardian.com/technology/2023/jan/17/age-checks-trolls-and-deepfakes-whats-in-the-online-safety-bill

Milmo, D. (2023b). 'AI-created child sexual abuse images "threaten to overwhelm internet"'. The Guardian. [online] Available at: https://www.theguardian.com/technology/2023/oct/25/ai-created-child-sexual-abuse-images-threaten-overwhelm-internet

Powell, A., Flynn, A. and Henry, N. (2018). 'AI can now create fake porn, making revenge porn even more complicated'. The Conversation. [online] Available at: https://theconversation.com/ai-can-now-create-fake-porn-making-revenge-porn-even-more-complicated-92267

Viejo, M. (2023). 'In Spain, dozens of girls are reporting AI-generated nude photos of them being circulated at school: "My heart skipped a beat"'. El País. [online] Available at: https://english.elpais.com/international/2023-09-18/in-spain-dozens-of-girls-are-reporting-ai-generated-nude-photos-of-them-being-circulated-at-school-my-heart-skipped-a-beat.html


About the author:

Farah Alaradi is a first-year Law and Anthropology student at the London School of Economics and Political Science, originally from Bahrain and Morocco. She hopes to pursue a career in public international and human rights law after graduation and is particularly passionate about amplifying the voices of SWANA natives within international humanitarian law, as well as the prevention of VAWG and FGM globally. She believes that the law is, at its core, a tool of prevention and protection, and that the global failure to apprehend the emerging use of AI to sexually violate women and girls is the antithesis of that.
