Taylor Swift searches unblocked on X after AI scandal

San Francisco, California - After Taylor Swift-related searches on X were disabled to prevent the spread of abusive AI-generated images of the star, the platform said it had lifted the block.

X lifted a block on searches for Taylor Swift on the platform, but promised to stay "vigilant" and prevent the spread of abusive AI-generated images.  © Collage: REUTERS

Bloomberg reporter Kurt Wagner cited Joe Benarroch, head of business operations for X, who said: "Search has been re-enabled and we will continue to be vigilant for attempts to spread this content and will remove it wherever we find it."

X had put a temporary block on searches using Swift's name in the wake of criticism by her fans, the White House, and others over the spread of the abusive images.

The company did not respond to a request for comment, but The Verge quoted Benarroch as saying the block on Swift searches was a temporary measure intended to "prioritize safety."


One fake image of the singer was seen 47 million times on X before it was removed Thursday. The post was reportedly live on the platform for around 17 hours.

"It is alarming," said White House Press Secretary Karine Jean-Pierre, when asked about the images on Friday.

X struggles to stem flow of toxic content

Deepfake images of celebrities are not new, but activists and regulators are worried that easy-to-use AI tools will create an uncontrollable flood of toxic or harmful content.

X is one of the biggest platforms for porn content in the world, analysts say, as its policies on nudity are looser than those of the Meta-owned platforms Facebook and Instagram.

In a statement last week, X said that "posting Non-Consensual Nudity (NCN) images is strictly prohibited on X and we have a zero-tolerance policy towards such content."

The Elon Musk-owned platform said that it was "actively removing all identified images and taking appropriate actions against the accounts responsible for posting them."

Cover photo: Collage: REUTERS
