The surge of AI-generated images of partly clothed women on Elon Musk’s X platform has ignited debate about the legality and regulation of such content in the UK. The Online Safety Act 2023 requires social media platforms to act against intimate image abuse, yet enforcement remains weak. Sharing intimate images without consent is illegal under the Sexual Offences Act, though nuances exist: images generated from prompts such as “bikini” may fall outside its scope. While the law targets non-consensual intimate images, recent legislative measures banning nudifying apps have not yet come into force. Platforms offering tools like Grok may also face scrutiny for failing to implement adequate age-checking measures. Additionally, the Internet Watch Foundation has raised concerns that Grok is being exploited to create child sexual abuse imagery, which is strictly illegal. Under UK GDPR, individuals have the right to request the removal of manipulated images of themselves, and complaints can be escalated if platforms fail to comply.
