
Experts Warn of Major Vulnerabilities in Tools That Safeguard Artists’ Work from AI


Tools such as Glaze and Nightshade, designed to protect artists’ work from unauthorized AI use, have significant vulnerabilities. These tools apply invisible distortions to artworks to mislead AI models during training, but researchers from Cambridge and Darmstadt have developed a method called LightShed that can effectively bypass these protections. LightShed detects and removes the distortions, leaving the artwork exposed to AI training without the artist’s consent. The finding highlights the inadequacy of current protection methods and underscores the need for more robust ways to safeguard artists’ rights.

Separately, over 1,400 members of the Equity union, which represents performers in film and TV, have voiced concerns over inadequate AI protections in their contracts. Amid ongoing negotiations with Pact for better contract terms, they are demanding strong regulations to protect their image and likeness — part of a growing concern across the creative industry about AI advances and artists’ rights.
