Victim of pornographic deepfake, Alexandria Ocasio-Cortez believes ‘it’s similar to rape’

Democrat Alexandria Ocasio-Cortez was the victim of a pornographic montage that went viral on X. In an interview with Rolling Stone, she spoke about this traumatic experience.

A horror scene. During a car trip with her team, Democrat Alexandria Ocasio-Cortez, the youngest woman ever elected to the United States Congress, lived through the traumatic experience of discovering that she was the victim of deepfake porn, a media synthesis technique that uses artificial intelligence to superimpose a person’s face onto existing pornographic images. In an interview with the magazine Rolling Stone, she spoke about this act of cyberviolence of incredible cruelty.

There is nothing “imaginary” about deepfakes

The congresswoman explained that being confronted with a depiction of herself performing a sexual act made her understand that seeing her image hijacked for pornographic purposes is not nearly as abstract or “imaginary” as people would have us believe. Citing studies on the subject, AOC recalled that the human brain has great difficulty distinguishing true from false, even when it knows that what it is looking at is fake. She added that she replayed the images over and over in her head for the rest of the day.

“It’s shocking to see images of yourself that someone else might think are real,” Alexandria Ocasio-Cortez told Rolling Stone. “Being a survivor of physical sexual assault adds a level of dysregulation. It brings back trauma, while I’m… in the middle of a fucking meeting.” In her view, “digitizing this violent humiliation” is akin to committing physical rape, and it is a safe bet that “some people will go so far as to commit suicide after being victimized.”

An insufficient legislative arsenal

Earlier this year, a wave of pornographic deepfakes targeting American singer Taylor Swift brought to the fore the urgent need to regulate this dangerous practice, which today falls into a near-total legal void. As The Guardian recalls, Ocasio-Cortez also supports a bill in Congress – the Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024 (DEFIANCE Act) – aimed at making it easier “for victims of non-consensual AI pornography to report publishers, distributors and consumers of X-rated digital counterfeits”.


In France, many female public figures, such as Salomé Saqué and Léna Situations, have also been victims of this type of cyberviolence, which aims not so much to deceive users (by leading them to believe the images are real) as to exercise control over women’s bodies, as journalist Lucie Ronfaut explains in an article published on the Numerama website.

In France, too, the law remains inadequate in the face of this scourge. Since it is currently extremely difficult to identify the authors of deepfakes, it remains almost impossible to punish them with the existing legal arsenal.


Source: Madmoizelle
