As pressure mounts, xAI says Grok will stop undressing people

Hours after a coalition of digital rights, child safety, and women’s rights organizations asked Apple to “take immediate action” against X and Grok AI, xAI confirmed that Grok will no longer edit “images of real people in revealing clothing such as bikinis,” albeit with significant carve-outs. Here are the details.

Apple faces renewed pressure to remove X and Grok from the App Store

In recent days, countless X and Grok users have been asking xAI’s chatbot to undress women and even underage girls, based on photos posted to X.

While xAI initially described the situation as “lapses in safeguards,” Grok kept complying with requests to edit images in this way. This, in turn, led to X being blocked in several countries and to xAI becoming the target of investigations in others.

In the meantime, Apple has been facing renewed pressure to remove the X and Grok apps from the App Store, from both senators and users. Earlier today, a coalition of 28 digital rights, child safety, and women’s rights organizations submitted open letters to Apple and to Google, asking both companies “to take immediate action to ban Grok, the large language model (LLM) powered by xAI,” from their app stores.

From the open letter:

We, the undersigned organizations, write to urge Apple leadership to take immediate action to ban Grok, the large language model (LLM) powered by xAI, from Apple’s app store. Grok is being used to create mass amounts of nonconsensual intimate images (NCII), including child sexual abuse material (CSAM)—content that is both a criminal offense and in direct violation of Apple’s App Review Guidelines. Because Grok is available on the Grok app and directly integrated into X, we call on Apple leadership to immediately remove access to both apps.

And:

As it stands, Apple is not just enabling NCII and CSAM, but profiting off of it. As a coalition of organizations committed to the online safety and wellbeing of all—particularly women and children—as well as the ethical application of artificial intelligence (AI), we demand that Apple leadership urgently remove Grok and X from the App Store to prevent further abuse and criminal activity.

xAI says Grok will stop editing images, sort of

While Apple and Google have remained mostly silent since the issue began, prompting harsh criticism as well as speculation that they feared angering Elon Musk and even President Trump, xAI confirmed today that it will update the Grok account on X to address the problem, at least in part:

Updates to @Grok Account

We have implemented technological measures to prevent the Grok account from allowing the editing of images of real people in revealing clothing such as bikinis. This restriction applies to all users, including paid subscribers.

Additionally, image creation and the ability to edit images via the Grok account on the X platform are now only available to paid subscribers. This adds an extra layer of protection by helping to ensure that individuals who attempt to abuse the Grok account to violate the law or our policies can be held accountable.

Geoblock update

We now geoblock the ability of all users to generate images of real people in bikinis, underwear, and similar attire via the Grok account and in Grok in X in those jurisdictions where it’s illegal.

Unsurprisingly, a quick search of Grok’s mentions reveals that multiple X subscribers are already attempting to circumvent the newly imposed restrictions, with some success. At the same time, non-subscribers mostly get the following message:

Image generation and editing are currently limited to verified Premium subscribers. You can subscribe to unlock these features.

9to5Mac’s take

As xAI adjusts the new filters and rules, it remains to be seen whether this will be enough to put this case to rest.

History shows that trolls can be relentlessly creative when trying to circumvent safety restrictions, especially when it comes to attacking women online.

And considering the multiple carve-outs in xAI’s announcement, including the fact that some of the rules apply only to Grok’s account on X, it is more likely than not that this won’t be the end of it, particularly for the victims.

Be that as it may, one thing is certain: it has been severely disappointing to see Apple sit on the problem (at least publicly), hoping it would go away on its own.

With every new nonconsensual bikini image and CSAM image left widely accessible to minors through X’s iOS app over the last few weeks, Apple has not only undermined its go-to argument that it strives to foster a safe environment on the App Store, but also reinforced the (sometimes unfair) notion that it has completely lost its spine in recent years.
