New Delhi, India
The internet erupted in anger after doctored pictures of Taylor Swift began circulating on social media platforms. Earlier this week, explicit AI-generated images depicting the singer at a Kansas City Chiefs game went viral, triggering a massive backlash.
Presumably to prevent people from searching for the images, X (formerly Twitter) appears to have taken a few steps. Amidst all this, some changes have been made to the micro-blogging site, and one change netizens have noticed is that they are unable to search for Taylor Swift on the platform.
SAG-AFTRA calls Taylor Swift's explicit AI photos 'harmful', says 'must be made illegal'
If you search for Taylor Swift now that the explicit photos have gone viral, you will get an error message that reads, “Something went wrong, but don’t fret – it’s not your fault.”
However, searches for terms like 'Taylor', 'Swift', or 'Taylor and Swift' return results as normal.
The change came after several fake AI-generated images of Taylor went viral. As per the Daily Mail, the sexualised images of Taylor were created by Celeb Jihad, a deepfake porn website that has shared faked photos of several A-listers.
Fan breaks Guinness World Record for identifying most Taylor Swift songs in a minute
Soon after the images went viral, many netizens took action, urging social media users to report the pictures and demanding legal consequences for this heinous act.
Taylor Swift has not yet reacted to the matter. However, a source told the Daily Mail that Swift is “furious” over her nude AI-generated images.
“Whether or not legal action will be taken is being decided but there is one thing that is clear: these fake AI generated images are abusive, offensive, exploitative and done without Taylor’s consent and/or knowledge,” a source told the Daily Mail.
SAG-AFTRA on Taylor Swift's explicit AI photos
On Friday, SAG-AFTRA (Screen Actors Guild-American Federation of Television and Radio Artists) called Swift's pornographic AI photos 'harmful'.
In the statement released, the union representing Hollywood actors said, “The development and dissemination of fake images — especially those of a lewd nature — without someone’s consent must be made illegal.”
“As a society, we have it in our power to control these technologies, but we must act now before it is too late,” the union said.