The social network X, formerly known as Twitter, has moved to block searches for Taylor Swift after lewd AI images of the singer circulated online.
The singer, 34, was reportedly left "furious" and considering legal action after images of her were digitally manipulated to show her in explicit poses.
After being shared by an account on the platform, the images quickly went viral and even drew the attention of the White House, which vowed to take action to prevent deepfakes like these from being shared.
Now, X has temporarily blocked users from searching for the star, with terms including 'Taylor Swift' returning an error. Anyone attempting the search is met with a "something went wrong" message while the platform works to stop the spread of the images.
X's head of business operations, Joe Benarroch, told the Wall Street Journal in a statement: "This is a temporary action and done with an abundance of caution as we prioritize safety on this issue."
Entertainment labor union SAG-AFTRA also condemned the AI images of Taylor, saying in a statement: "The sexually explicit, A.I. generated images depicting Taylor Swift are upsetting, harmful, and deeply concerning.
"The development and dissemination of fake images - especially those of a lewd nature - without someone’s consent must be made illegal. As a society, we have it in our power to control these technologies, but we must act now before it is too late."
The White House has also pledged to crack down on the abuse of AI technology to create such vile imagery of real people.
White House press secretary Karine Jean-Pierre said: "We are alarmed by the reports of the circulation of images that you just laid out… There should be legislation, obviously, to deal with this issue."
According to NBC News, the deepfake images of Taylor were viewed more than 27 million times in the 19 hours before the anonymous account that posted them was suspended.
X also addressed the images, which it said it was "actively removing", stating: "Posting Non-Consensual Nudity (NCN) images is strictly prohibited on X and we have a zero-tolerance policy towards such content.
"Our teams are actively removing all identified images and taking appropriate actions against the accounts responsible for posting them. We’re closely monitoring the situation to ensure that any further violations are immediately addressed, and the content is removed. We’re committed to maintaining a safe and respectful environment for all users."
Taylor has yet to comment publicly on the gross invasion of privacy; however, sources close to the star said she was considering legal action.