A terrifying new AI app swaps women into porn videos with one click


From the beginning, deepfakes — synthetic media generated by AI — have been used primarily to create pornographic images of women, who often find the experience psychologically devastating. The original deepfake creator on Reddit popularized the technology by swapping female celebrities' faces into porn videos. To this day, the research company Sensity AI estimates that between 90% and 95% of all deepfake videos online are nonconsensual porn, and around 90% of those target women.

As the technology has advanced, easy-to-use no-code tools have also emerged that let users "strip" the clothes off women's bodies in images. Many of these services have since been forced offline, but the code still exists in open-source repositories and keeps resurfacing in new forms. The latest such site received more than 6.7 million visits in August, according to the researcher Genevieve Oh, who discovered it. It has yet to be taken offline.

There are other single-photo face-swapping apps, such as ZAO and ReFace, which place users into selected scenes from popular movies or music videos. But as the first app dedicated to pornographic face-swapping, Y takes this to a new level. It is "tailor-made" to create pornographic images of people without their consent, says Adam Dodge, founder of EndTAB, a nonprofit that educates people about technology-enabled abuse. This makes it easier for its creators to refine the technology for this specific use case, and it entices people who would otherwise never have thought about creating deepfake porn. "Any time you specialize like that, it creates a new corner of the internet that will draw in new users," Dodge says.

Y is extremely easy to use. Once a user uploads a photo of a face, the site opens a library of porn videos. The vast majority feature women, though a small handful also feature men, mostly in gay porn. The user can then select any video to generate a preview of the face-swapped result within seconds — and pay to download the full version.

The results are far from perfect. Many of the face swaps are obviously fake, with the faces shimmering and distorting as they turn at different angles. But to a casual observer, some are subtle enough to pass, and the trajectory of deepfakes has already shown how quickly they can become indistinguishable from reality. Some experts argue that the quality of the deepfake may not even matter much, because the psychological toll on victims can be the same either way. And many members of the public remain unaware that such technology exists, so even low-quality face swaps can deceive people.

To this day, I've never been able to fully get any of the images taken down. It will be there forever. No matter what I do.

Noelle Martin, an Australian activist

Y bills itself as a safe and responsible tool for exploring sexual fantasies. The site's language encourages users to upload their own faces. But nothing prevents them from uploading other people's faces, and comments in online forums suggest that users have already been doing exactly that.

The consequences for the women and girls targeted by such activity can be devastating. On a psychological level, these videos can feel as violating as revenge porn — real intimate videos filmed or released without consent. "This kind of abuse — where people misrepresent your identity, name, and reputation, and alter it in such violating ways — shatters you to the core," says Noelle Martin, an Australian activist who has been targeted by a deepfake porn campaign.

And the repercussions can stay with victims for life. The images and videos are difficult to remove from the internet, and new material can be created at any time. "It affects your interpersonal relations; it affects you in getting jobs. It could be brought up in every single job interview you go to. Potential romantic relationships," Martin says. "To this day, I've never been able to fully get any of the images taken down. It will be there forever. No matter what I do."

In some ways, it is even more complicated than revenge porn. Because the content is not real, women can doubt whether they deserve to feel traumatized and whether they should report it, Dodge says. "If somebody is wrestling with whether they're even a victim, it impairs their ability to recover," he says.

Nonconsensual deepfake porn can also have economic and career consequences. Rana Ayyub, an Indian journalist who was targeted by a deepfake porn campaign, faced such intense online harassment in its aftermath that she had to minimize her online presence — and with it the public profile required to do her work. Helen Mort, a British poet and broadcaster who previously shared her story with MIT Technology Review, said she felt pressure to do the same after discovering that her photos had been stolen from private social media accounts to create fake nudes.

The UK's government-funded revenge porn helpline recently received a case from a teacher who lost her job after deepfake pornographic images of her were circulated on social media and brought to her school's attention, says Sophie Mortimer, who manages the service. "It's getting worse, not better," Dodge says. "More women are being targeted this way."

Y's ability to create deepfake gay porn, though limited, poses an additional threat to men in countries where homosexuality is criminalized, Ajder says. This is the case in 71 jurisdictions globally, 11 of which punish the offense with death.

Ajder, who has discovered numerous deepfake porn apps over the past few years, says he has tried to contact Y's hosting service and push the site offline. But he is pessimistic about preventing tools like it from being created. Another site has already popped up that appears to be attempting the same thing. He believes that banning such content from social media platforms — and perhaps even making its creation or consumption illegal — would be a more sustainable solution. "That means these websites are treated the same way as dark-web material," he says. "Even if it's driven underground, at least it's out of the eyes of everyday people."

Y did not respond to multiple requests for comment sent to the press email listed on its site. The registration information associated with its domain is also blocked by the privacy service Withheld for Privacy. On August 17, after MIT Technology Review made a third attempt to reach the creator, the site posted a notice on its homepage saying it is no longer available to new users. As of September 12, the notice was still there.
