Artificial intelligence has brought many benefits, but also worrying developments, as shown by the growing number of women who report being victims of the misuse of this tool. Many of them, most of them famous, lament the circulation of images in which they appear naked but which are fake.
The influencer Marina Rivers spoke this Tuesday on TardeAR, as she has been affected by the circulation of this type of image: "Photos taken from my Instagram feed are spread on Telegram, and if you look closely you can see they are fake. I get a lot of messages telling me there are naked photos of me, and it's dangerous."
She shared these statements after the complaint by the minors from Almendralejo (Badajoz) about the fake nude images of them that were circulated. "In the end, we women always suffer from the garbage they create with artificial intelligence. As wonderful as it can be, what they do is create nudes or put your face in porn videos using your images," she complained.
However, the TikToker is not the only public figure who has suffered the consequences of so-called deepfakes. The most famous case in our country, and the one that first brought the issue into public conversation, involved our most international artist: Rosalía.
Last May, a singer shared a pornographic image of the Catalan artist, causing a stir online due to the seriousness of the matter and the reaction of the performer, who stated that a woman's body is "not public property" nor "a commodity" for marketing strategies.
"Those photos were edited and you created a false narrative around them when I don't even know you. There is something called consent, and to all of you who found it funny or plausible, I sincerely hope that one day you learn that you come from a woman, that women are sacred and that we must be respected," she said on Twitter.
Aitana, Mónica Naranjo, Rigoberta Bandini, India Martínez, Angy Fernández and Beatriz Luengo were some of the artists who took a stand in favor of the author of Motomami, who even received the support of the Minister of Equality, Irene Montero.
And the same thing happened, last August, to the influencer Laura Escanes. "I received a link to naked photos of me, edited and created by AI (artificial intelligence). Apart from feeling totally used and exposed, there is something that makes my blood boil: a woman's body is not to be used. Not for pleasure, not for abuse, not for manipulation. I am repelled by the person who created them, but also by those who see it, think it is funny and remain silent," the businesswoman complained.
This type of content is called a deepfake, a term that combines fake and deep (from deep learning), referring to the set of machine learning algorithms that artificial intelligence uses to generate such images.
Beyond our borders, other celebrities such as Emma Watson have been victims of this type of montage, published under the label NSFW (Not Safe For Work), an acronym indicating that the material is sensitive because it is controversial, violent, pornographic or potentially offensive.