The Horrifying AI Pictures That Taylor Swift Wants to Erase from the Internet and Threatens to Sue Over
Taylor Swift is one of the most recognized figures of the moment. That success is a double-edged sword, however: along with fame and fortune, it brings headaches, such as the disturbing AI-generated images that have been circulating on the internet for days.
The images show the singer in hypersexualized situations, most of them centered on her recent connection to the Kansas City Chiefs through her boyfriend, tight end Travis Kelce.
Press reports claim that the images were generated on the infamous porn site Celeb Jihad, but they soon made the jump to social media. For a couple of days now, the images have appeared repeatedly on timelines on X and Instagram, provoking the fury of the singer's fans.
Swifties call to arms
A multitude of posts have expressed outrage at the images. One of the most eloquent asks, “How is this not considered sexual assault? I cannot be the only one who is finding this weird and uncomfortable?”
And it continues: “We are talking about the body/face of a woman being used for something she probably would never allow/feel comfortable. How are there no regulations or laws preventing this?”
Another made a forceful call: “I’m gonna need the entirety of the adult Swiftie community to log into Twitter, search the term ‘Taylor Swift AI,’ click the media tab, and report every single AI-generated pornographic photo of Taylor that they can see because I’m f*cking done with this BS. Get it together Elon!”
They were not the only ones. Another user wrote, “Whoever is making this garbage needs to be arrested. What I saw is just absolutely repulsive, and this kind of s*ht should be illegal … we NEED to protect women from stuff like this.”
The USA does have legislation on the matter
Although regulation is lacking at the federal level, states such as Georgia, Hawaii, Virginia, New York, Minnesota, and Texas have laws against so-called deepfake pornography. In California and Illinois, victims can sue creators for defamation.
The infamous site where the images were generated is under constant attack from NGOs and women's rights activists but continues to operate, while the social media platforms where the images have spread, such as Instagram, Reddit, 4Chan, and X, have been overwhelmed by the avalanche of images.
Neither Swift nor her representatives have commented on the recent intrusion, and the social media platforms where the images have been seen have not issued statements on the matter either, so we will have to wait and see how this controversy develops.