Deepfakes and gender-based violence

In 2017, journalist Samantha Cole discovered someone on Reddit who was using open-source artificial intelligence (AI) technology to create homemade, non-consensual pornography and sexual images using the faces of celebrities. This person called themselves "deepfakes."

Early deepfakes were easy to spot because they were glitchy and unrealistic. The technology has since become so sophisticated that anyone with a basic understanding of computers and AI, and access to a reasonably powerful computer, can easily make, distribute and sell a convincing deepfake.

All a deepfaker needs are photographs of the person they want to target, taken from a public source like Instagram, to create very realistic sexualized images or pornography.

“There are real sexual autonomy questions and harms that come with just the creation,” Suzie Dunn, assistant professor at Dalhousie’s Schulich School of Law, told rabble.ca during an interview.

Dunn went on to say, “Now, you can have people who can really sexually violate people in pretty serious ways without actually even having to have contact with them.”

A new form of gender-based violence

The creation of a sexualized deepfake, in and of itself, is a violation of sexual autonomy – the right of an individual to make decisions about their own body and sexuality without interference. 

Publishing deepfakes online while claiming they are real sexual content is tantamount to the non-consensual distribution of intimate images, because the resulting public harm is the same.

The Sensity AI report, The State of Deepfakes 2019: Landscape, Threats, and Impact, found that 96 per cent of deepfakes are used to create non-consensual sexual content. Of those deepfakes, 99 per cent were images of women.

This is the newest form of gender-based violence.

“In Canada, and globally, the harms of sexual deepfakes are being recognized. When these deepfakes started coming out of that Reddit site, a lot of people were posting them on Pornhub and different places. Quite immediately, most social media companies, including Pornhub, created policy that said that type of content is not allowed and we include it under the same category as other non-consensual image rules that you can’t post non-consensual content on our websites,” Dunn said.

Australian Noelle Martin was targeted by someone who found her photos on the internet. They began making fake, photoshopped pornography and, eventually, deepfakes of Martin.

Martin advocated for legal changes…
