Taylor Swift has proven herself capable of many things, including toppling monopolies and making mass transit cool again, but her latest endeavor—taking on artificial intelligence—is perhaps her most ambitious yet.
Sexually explicit AI-generated images of Swift have been circulating on social media this week, with one X post attracting more than 45 million views. While the user who shared the images was suspended, the post was up on the site for almost a full day. Now, Swift is reportedly considering legal action. “These fake AI generated images are abusive, offensive, exploitative, and done without Taylor’s consent and/or knowledge,” a source close to the 34-year-old singer told the Daily Mail.
The issue of deepfakes—that is, the use of AI to fabricate or manipulate a person’s likeness—has been gaining traction lately, with U.S. politicians including New York congressman Joseph Morelle speaking out about Swift’s predicament. (Last year, in an attempt to make the sharing of deepfake pornography illegal, Congressman Morelle introduced the Preventing Deepfakes of Intimate Images Act.) “What’s happened to Taylor Swift is nothing new,” wrote New York congresswoman Yvette D. Clarke on X. “This is an issue both sides of the aisle & even Swifties should be able to come together to solve.”
This isn’t the first time Swift has taken a stand against sexual exploitation, of course. In 2017, she countersued a Colorado DJ for battery and sexual assault after he groped her at a pre-show meet-and-greet in 2013. (The DJ had initially sued Swift, claiming she got him wrongfully terminated from his job.) “I just wanted to say I’m sorry to anyone who ever wasn’t believed, because I don’t know what turn my life would have taken if somebody didn’t believe me when I said something had happened to me,” Swift said on the one-year anniversary of her sexual assault trial verdict.
It’s deeply unfortunate that Swift is now weathering the predatory manipulation of her image online, but her reach and power are likely to bring attention to the issue of criminal AI misuse—one that has already victimized far too many people.