The White House vows to take action on deepfakes after lewd AI images of Taylor Swift circulate


By Kim Novak


The White House has doubled down on its promise to tackle 'deepfake' images that depict real people in fabricated sexually explicit scenes, after Taylor Swift fell victim to the practice this week.

It comes after Taylor was said to be considering taking legal action after explicit deepfake images of the 34-year-old singer at a Kansas City Chiefs game began popping up on X (formerly known as Twitter).

Taylor, who is dating the team's tight end Travis Kelce, was said to have been left "furious" after the images emerged, and the hashtag #ProtectTaylorSwift soon started trending.

A source close to Taylor told the Daily Mail: "Whether or not legal action will be taken is being decided, but there is one thing that is clear: these fake AI-generated images are abusive, offensive, exploitative, and done without Taylor’s consent and/or knowledge."

Credit: Gareth Cattermole/WireImage for Parkwood/Getty Images

They added: "The Twitter account that posted them does not exist anymore. It is shocking that the social media platform even let them be up to begin with. These images must be removed from everywhere they exist and should not be promoted by anyone.

"Taylor’s circle of family and friends are furious, as are her fans obviously. They have the right to be, and every woman should be. The door needs to be shut on this. Legislation needs to be passed to prevent this and laws must be enacted."

Thankfully, it seems that this gross violation of Taylor's privacy has brought the issue to the White House's attention, as it has vowed to take action to prevent such incidents from occurring.

White House Press Secretary Karine Jean-Pierre told ABC News that the incident is "alarming" and that Congress "should take legislative action".

Jean-Pierre added: "We are alarmed by the reports of the… circulation of images that you just laid out - of false images to be more exact, and it is alarming.

"While social media companies make their own independent decisions about content management, we believe they have an important role to play in enforcing their own rules to prevent the spread of misinformation, and non-consensual, intimate imagery of real people."

She highlighted the action the administration has recently taken, including launching a task force to address online harassment and abuse and the Department of Justice launching the first national 24/7 helpline for survivors of image-based sexual abuse.

The images were edited from photos of Taylor at Kansas City Chiefs games. Credit: RJ Sangosti/MediaNews Group/The Denver Post via Getty Images

There is currently no federal law in the US to prevent or deter a person from creating or sharing deepfake images without the subject's consent. However, Rep. Joe Morelle last week renewed a push to pass a bill that would make the non-consensual sharing of digitally altered explicit images a federal crime, carrying fines and jail time for perpetrators.

A spokesperson for Morelle told the outlet: "We're certainly hopeful the Taylor Swift news will help spark momentum and grow support for our bill, which as you know, would address her exact situation with both criminal and civil penalties."

Meta also spoke out after the images went viral, telling the Daily Mail: "This content violates our policies, and we’re removing it from our platforms and taking action against accounts that posted it."

One post of the fabricated images was reportedly viewed over 45 million times before the account was suspended on Thursday, with X's safety team saying on Friday that it was "actively removing all identified images" and "taking appropriate actions against the accounts responsible for posting them."

Their statement read: "Posting Non-Consensual Nudity (NCN) images is strictly prohibited on X and we have a zero-tolerance policy towards such content. We're closely monitoring the situation to ensure that any further violations are immediately addressed, and the content is removed. We're committed to maintaining a safe and respectful environment for all users."

Featured image credit: Lionel Hahn/Getty Images
