In July 2025, Jessica Guistolise, Megan Hurley, and Molly Kelley told CNBC about their experiences with nonconsensual deepfake pornography created by a friend using the AI tool DeepSwap. The incident, which affected over 80 women in the Minneapolis area, came to light in 2024, when the women discovered that their Facebook photos had been manipulated into explicit content. The case underscores the risks posed by “nudify” apps, which are now easily accessible online and through app stores. Despite the emotional trauma they suffered, the women found little legal recourse, and the creator faced no apparent penalties. They are now advocating for a bill, proposed by Democratic state senator Erin Maye Quade, that would penalize tech companies that enable such services in Minnesota. Experts warn that the technology is advancing rapidly, making legal protections against this kind of exploitation increasingly urgent. As calls for accountability grow, the fight against AI-generated harassment is intensifying.