According to recent research, apps that digitally "undress" women have surged in popularity. Known as "nudify" apps, these free applications use AI to manipulate images, creating explicit depictions of individuals without their consent.
Users can photograph women in public or scrape their images from social media without permission, then turn the photos of an unsuspecting target into pornography to be sold or kept for personal gratification. The disturbing growth of these services is evidenced by a 2,400% increase in advertising links to such apps on platforms like X and Reddit since the beginning of the year, according to Graphika, a social network analysis company.
There is currently no federal law in the United States addressing the creation and dissemination of non-consensual deepfake pornography. We are committed to changing that, and we advocate for a federal law making non-consensual deepfake pornography illegal. We also urge reform of Section 230 of the Communications Decency Act, the provision that shields online platforms from liability for user-generated content. That legal shield has allowed a business built on creating and trading non-consensual deepfake pornography to flourish.