X blocks Taylor Swift searches amid AI surge of fake graphic images
Taylor Swift continued to be the most talked-about woman on the internet on Sunday, when the Kansas City Chiefs were declared Super Bowl-bound.
Despite Travis Kelce and Swift being at the forefront of trending topics, X still had her name blocked from its search function.
The decision to make Swift's name unsearchable on X came earlier in the week, after sexually explicit AI-generated images of her were discovered circulating on the social media network.
Representatives for X did not immediately respond to Fox News Digital's request for comment.
As of Monday morning, Swift's name was still not searchable on the platform. Users could still search for "Taylor" and "Swift" individually.
The fake pornographic images made the rounds on Thursday, with X suspending at least one account associated with the images.
X appeared to respond to the backlash with a statement from its safety team notifying users that posting "non-consensual nudity (NCN) images is strictly prohibited on X and we have a zero-tolerance policy towards such content."
"Our teams are actively removing all identified images and taking appropriate actions against the accounts responsible for posting them," they said. "We're closely monitoring the situation to ensure that any further violations are immediately addressed, and the content is removed. We're committed to maintaining a safe and respectful environment for all users."
The White House reacted Friday to the viral explicit images of Swift and pressed Congress for a legislative crackdown.
"We are alarmed by the reports of the circulation of images … of false images to be more exact. And it is alarming," White House press secretary Karine Jean-Pierre said during a press briefing.
"So while social media companies make their own independent decisions about content management, we believe they have an important role to play in enforcing their own rules to prevent the spread of misinformation and non-consensual, intimate imagery of real people."
The SAG-AFTRA actors union denounced the false images of Swift as "upsetting" and "deeply concerning."
"The development and dissemination of fake images — especially those of a lewd nature — without someone’s consent must be made illegal," the union said in a statement. "As a society, we have it in our power to control these technologies, but we must act now before it is too late."
The union added, "We support Taylor, and women everywhere who are the victims of this kind of theft of their privacy and right to autonomy."