The researchers found that the number of links advertising undressing apps rose over 2,400 per cent since the beginning of this year on social media, including on X and Reddit, reported Bloomberg. The report mentioned that 53 Telegram groups used to access these services have at least 1 million users.

The use of AI for generating non-consensual pornography has been a worrying trend for years now. Websites that create deepfake nudes – digitally manipulated images that make someone appear naked – have surfaced in the past few years.

The menace is all the more concerning as it is also being used to target minors. Last month, deepfake images of female students at a New Jersey high school were circulated online. In September, more than 20 girls fell prey to deepfake photos generated by the AI-powered app 'Clothoff', which lets users 'undress girls for free', reported Daily Mail. These fake nude images, made using fully clothed pictures posted on the girls' Instagram accounts, were then shared in WhatsApp groups.

The menace of deepfake pornography mostly targets women. According to a Vice report in 2019, a software called DeepNude was used to produce a convincing nude image of a woman in 30 seconds. Earlier this year, a teenage boy allegedly used AI apps to create deepfake images of female students at a high school in Seattle, Washington, as per the Daily Mail report.

Speaking to Bloomberg, Eva Galperin, director of cybersecurity at the Electronic Frontier Foundation, said: "We are seeing more and more of this being done by ordinary people with ordinary targets. You see it among high school children and people who are in college."

Psychotherapist Lisa Sanfilippo, whose expertise includes sexual trauma, told Business Insider earlier that creating false nude images "is a major violation" of people's privacy that can bring intense trauma to the victim. She said that for the victim, "seeing images of yourself – or images that are falsified to look like you – in acts that you might find reprehensible, scary or that would be only for your personal life can be very destabilising – even traumatising". "It's abuse when someone takes something from another person that has not been freely given to them," Sanfilippo added. "There is no ability to give consent there."

With AI tools becoming more accessible, it has also become easier and cheaper to produce non-consensual sexually explicit content. From celebrities to a layperson, deepfakes can be deployed to target anyone with just a few clicks.

The availability of multiple open-source diffusion models has also made it simpler to alter images, leading to the launch of these "undressing" websites and apps, Graphika's report said. "Bolstered by these AI services, synthetic NCII providers now operate as a fully-fledged online industry, leveraging many of the same marketing tactics and monetisation tools as established e-commerce companies. This includes advertising on mainstream social media platforms, influencer marketing, deploying customer referral schemes, and the use of online payment technologies," Forbes cited the report as saying.