An Artificially Intelligent New Threat to Women

Blackmail using Photoshop has been taken to a new extreme. Disturbingly realistic doctored adult videos are being made with the faces of both celebrities and everyday women.

One woman saw her own face seamlessly grafted, without her knowledge or consent, onto someone else’s body: a young porn actress, just beginning to disrobe for the start of a graphic sex scene. She described feeling nauseous and mortified. “I feel violated – this icky kind of violation,” said the woman, who is in her 40s and spoke on the condition of anonymity because she worried that the video could hurt her marriage or career.

Airbrushing and Photoshop long ago made photos easy to manipulate. Now videos are becoming just as vulnerable, with AI and animation producing fakes that look deceptively real.

The South China Morning Post reported further on this issue. In its article “‘Deepfake’ porn videos used as weapons against women,” it stated, “Supercharged by powerful and widely available artificial-intelligence software developed by Google, these lifelike ‘deepfake’ videos have quickly multiplied across the internet, blurring the line between truth and lie.

“But the videos have also been weaponized disproportionately against women, representing a new and degrading means of humiliation, harassment, and abuse.

“The fakes are explicitly detailed, posted on popular porn sites and increasingly challenging to detect.”

The issues and heartbreak that come out of these videos are never-ending. “This can hurt your job prospects, your interpersonal relationships, your reputation, your mental health,” said media critic Anita Sarkeesian. “It’s used as a weapon to silence women, degrade women, show power over women, reducing us to sex objects. This isn’t just a fun-and-games thing. This can destroy lives.”

These videos are also being used as propaganda. One clip went viral for targeting Parkland school shooting survivor and activist Emma Gonzalez. The clip appeared to show Gonzalez alongside three other women, ripping up the US Constitution.

GOP activists were quick to share the video as supposed proof of her un-American treachery; in reality, the video showed her ripping up paper targets from a shooting range.

Unfortunately, victims of deepfakes have few tools to fight back. Legally, these videos are treated as “nonconsensual pornography,” and victims must rely on strategies similar to those used against online harassment, cyberstalking and revenge porn.

But experts say “deepfakes” are often too untraceable to investigate and exist in a legal grey area: built on public photos, they are effectively new creations, meaning they could be protected as free speech.

On a slightly better note, websites like Reddit and Pornhub have banned these videos.

For more information, read “Graphic ‘deepfake’ porn videos are being weaponized to humiliate women” here.