Artificial intelligence will likely be used in the future by unscrupulous people or groups to manufacture “evidence” of an alleged “crime.”
As I have noted before, the spread of a new technology is oftentimes tied to its use in pornography. In keeping with this pattern, recent developments in AI-powered photo and video editing now allow people to realistically superimpose naked bodies onto otherwise clothed ones, giving the impression that a person has “nudes” when no such images exist:
A new AI-powered software tool makes it easy for anyone to generate realistic nude images of women simply by feeding the program a picture of the intended target wearing clothes.
The app is called DeepNude and it’s the latest example of AI-generated deepfakes being used to create compromising images of unsuspecting women. The software was first spotted by Motherboard’s Samantha Cole, and is available to download free for Windows, with a premium version that offers better resolution output images available for $99.
Both the free and premium versions of the app add watermarks to the AI-generated nudes that clearly identify them as “fake.” But in the images created by Motherboard, this watermark is easy to remove. (We were unable to test the app ourselves as the servers have apparently been overloaded.)
As we’ve seen with previous examples of deepfake pornography, the quality of the output is varied. It’s certainly not photorealistic, and when examined closely the images are easy to spot as fake. The AI flesh is blurry and pixelated, and the process works best on high-resolution images when the target is already wearing revealing clothes like a swimsuit.
But at lower resolutions — or when seen only briefly — the fake images are easy to mistake for the real thing, and could cause untold damage to individuals’ lives.
Although much of the discussion around the potential harms of deepfakes has centered on political misinformation and propaganda, the use of this technology to target women has been a constant since its creation. Indeed, that was how the tech first spread, with users on Reddit adapting AI research published by academics to create fake celebrity pornography.
A recent report from HuffPost highlighted how being targeted by deepfake pornography and nudes can upend someone’s life. As with revenge porn, these images can be used to shame, harass, intimidate, and silence women. There are forums where men can pay experts to create deepfakes of co-workers, friends, or family members, but tools like DeepNude make it easy to create such images in private, and at the touch of a button.
Notably, the app is not capable of producing nude images of men. As reported by Motherboard, if you feed it a picture of a man it simply adds a vulva.
The creator of the DeepNude app, who identified himself as “Alberto,” told Motherboard that he was inspired by memories of old comic book adverts for “X-ray specs,” which promised they could be used to see through people’s clothes. “Like everyone, I was fascinated by the idea that they could really exist and this memory remained,” said Alberto.
He says that he is a “technology enthusiast” rather than a voyeur, and is motivated by curiosity and enthusiasm for AI, as well as a desire to see if he could make an “economic return” from his experiments.
Alberto says he considered the potential for harm caused by this software, but ultimately decided it was not a barrier. “I also said to myself: the technology is ready (within everyone’s reach),” Alberto told Motherboard. “So if someone has bad intentions, having DeepNude doesn’t change much … If I don’t do it, someone else will do it in a year.”
We contacted Alberto to ask further questions, and he replied briefly, saying the app was created for fun and that he hadn’t expected it to become so popular. He again compared the software to Photoshop, saying this can be used to achieve the same results as DeepNude “after half hours of youtube tutorial.” He also added that if people started using the software for malicious purposes “we will quit it for sure.” (We followed up to ask what counted as a bad use-case, and how he would know, but have yet to receive an answer.)
One negative aspect that Alberto does seem to be worried about is the potential legal fallout, with the license agreement for DeepNude claiming that “every picture edited through this software is considered a fake parody,” and that the app is an “entertainment service” that does “not promote sexually explicit images.” This is an absurd claim to make, given the app’s name, how it’s being marketed, and, well, its entire functionality.
Deepfakes do exist in a legal gray area, though, with lawyers saying that AI-generated nudes could constitute defamation, but that removing them from the internet could violate the First Amendment. An exception to this might be if the technology is used to create nude images of minors — something that DeepNude seems capable of.
Right now, politicians around the world are beginning to consider the potential harms caused by deepfakes, including in the US. But such legislation will be slow to assemble, and the main priority for lawmakers is stopping the spread of political misinformation. Apps like DeepNude are almost certain to proliferate, offering faster and more realistic deepfakes in the years to come. (source, source)
“Nudes” are really not the concern here. Imagine if this same technology were used, for example, to superimpose a “terrorist uniform” on a photo of somebody. Given how consistently terrorism has been shown to be a central element of Western foreign policy, such a tool could portray somebody — anybody — as a “terrorist” in order to justify political prosecution and persecution, as well as certain foreign policy objectives.
Can one imagine this tool in the hands of a vindictive spouse? An overzealous prosecutor? A skilled blackmailer? The potential for abuse is enormous.
Technology can be a wonderful thing, but it can also be dangerous. It becomes clearer with each day that, in spite of the many good things that have come, the very technology meant to set the human race free is being used to place her into a bondage never before seen in world history — one that even the most horrendous science fiction stories have scarcely been able to describe.