Twitter, the world’s former public stage turned Stormfront alternative and porn site not long after Elon Musk’s takeover, is facing new legal trouble over Grok’s penchant for turning any photo into goon material at a user’s request. Pornographic deepfakes are nothing new (we’ve covered attempts to regulate them in the past), but Grok’s near omnipresence, combined with the need for Twitter users to control women’s bodies, has made it very easy to sexually harass women and children online. Given that the world’s foremost free speech absolutist only seems to give a damn when someone acts in a way that directly harms him, there needs to be some external pressure if there’s any hope of people posting pictures of their spouse or child on Twitter without @Lowkirkenuinely65 asking Grok to put her in a saran wrap bikini. Thankfully, California is stepping up to bat. The Guardian has coverage:
“The avalanche of reports detailing the non-consensual, sexually explicit material that xAI has produced and posted online in recent weeks is shocking,” California attorney general, Rob Bonta, said in a statement. “I urge xAI to take immediate action to ensure this goes no further.”
Bonta’s office is investigating whether and how xAI violated state law.
On X, California governor Gavin Newsom called for an investigation into “Grok’s disgusting spread of child porn on this website”.
Musk responded that there has been no use of the stripping tactic to turn photos of children into lewd images. One of the most vocal dissenters to that stance has been Grok itself.
Damn, do any of Elon’s children have his back?
The undress feature was reworked so it only worked for the fools who paid for Twitter Premium. When that didn’t fix the problem, the feature was apparently removed for everyone.
Let’s see how long this fix stays up — my money is on someone circumventing safety protocols by asking for the nudes in iambic pentameter. Godspeed, California.
California Attorney General Investigates Musk’s Grok AI Over Lewd Fake Images [The Guardian]
Earlier: Act Protecting Adults From Deepfaked Porn To Be Signed Into Federal Law
Washington State Takes A Stand Against Deepfake Porn
Chris Williams became a social media manager and assistant editor for Above the Law in June 2021. Prior to joining the staff, he moonlighted as a minor Memelord™ in the Facebook group Law School Memes for Edgy T14s. He endured Missouri long enough to graduate from Washington University in St. Louis School of Law. He is a former boatbuilder who is learning to swim, is interested in critical race theory, philosophy, and humor, and has a love for cycling that occasionally annoys his peers. You can reach him by email at cwilliams@abovethelaw.com and by tweet at @WritesForRent.