Almost exactly one year ago, in January 2018, I said that artificial intelligence facial manipulation technology was going to be a major trend to watch. At the time, the technology existed publicly only in small artistic and computer programming circles, but it had a breakout of popularity in the adult “entertainment” industry, where people were swapping the faces of their favorite actors and actresses onto the bodies of film prostitutes. This, I noted, was a sign that the technology was being formally accepted by the public and would be an important trend to watch, especially since its major use would be for military and political applications:
Since sex is such an effective tool for manipulating people and society, it is of more importance to pay attention to the sex robots as well as different forms of A.I., because there is seldom a more effective way of selling an idea to the public than enabling people to live out very deep, personal fantasies that would be difficult or impossible for them to realize otherwise. (source)
This warning has proven true, as shown recently with the creation of “Jennifer Buscemi,” a hybrid that superimposes the face of actor Steve Buscemi onto the body of actress Jennifer Lawrence in a near-seamless creation that is hard to distinguish from footage of a real human being:
While the nation grapples with concerns over the spread of inaccurate and deceptive information online, deepfakes—videos in which an individual’s face is superimposed onto another person’s body—continue to advance at a quickening pace.
Now, the new ones are downright terrifying.
What are deepfakes?
The technology, which relies on machine learning and artificial intelligence, was once largely relegated to researchers at prestigious universities. But over the past few years, a growing online community has democratized the practice, bringing powerful and easy-to-use tools to the masses.
One of the public’s first introductions to deepfakes came in late 2017, when a Reddit group devoted to placing the faces of prominent actresses onto the bodies of porn stars gained attention.
As reported by Motherboard’s Samantha Cole, members of the now-banned subreddit explained how they would first gather stock photos and videos of celebrities such as Hollywood star Scarlett Johansson. That media content would then be fed into specialized, open-source tools and combined with graphic adult content.
The quality of a deepfake depends on several factors but relies heavily on practice, time, and the source material it is derived from. Initially, deepfakes were more shocking than convincing, but readily available programs and tutorials continue to lower the bar for new creators.
One such video, posted by Reddit user VillainGuy earlier this month, has highlighted how far the technology has come. That video—which combines actor Steve Buscemi with actress Jennifer Lawrence at the 2016 Golden Globe awards—is turning heads. Not because anyone believes it is real, but because of the video’s implications.
Utilizing a free tool known as “faceswap,” VillainGuy proceeded to train the AI with high-quality media content of Buscemi. With the aid of a high-end graphics card and processor, “Jennifer Lawrence-Buscemi” was born. VillainGuy says the level of detail was achieved thanks to hours of coding and programming as well.
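The training the article describes follows the design behind the original open-source deepfake tools: a single shared encoder learns features common to both faces, while a separate decoder per identity learns to reconstruct that person; the swap is produced by encoding one face and decoding with the other identity’s decoder. The sketch below is illustrative only — linear layers and toy Gaussian “faces” stand in for the real convolutional networks, and all names and dimensions here are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for aligned face crops of two identities.
dim, latent, n = 16, 4, 200
faces_a = rng.normal(0.0, 1.0, (n, dim)) + 2.0   # "identity A"
faces_b = rng.normal(0.0, 1.0, (n, dim)) - 2.0   # "identity B"

enc = rng.normal(0, 0.1, (dim, latent))    # shared encoder
dec_a = rng.normal(0, 0.1, (latent, dim))  # decoder for identity A
dec_b = rng.normal(0, 0.1, (latent, dim))  # decoder for identity B

def loss(faces, dec):
    recon = (faces @ enc) @ dec
    return float(np.mean((recon - faces) ** 2))

lr = 1e-2
start = loss(faces_a, dec_a) + loss(faces_b, dec_b)
for _ in range(2000):
    for faces, dec in ((faces_a, dec_a), (faces_b, dec_b)):
        z = faces @ enc
        err = (z @ dec - faces) / n       # scaled reconstruction error
        grad_dec = z.T @ err              # gradient w.r.t. decoder
        grad_enc = faces.T @ (err @ dec.T)  # gradient w.r.t. shared encoder
        dec -= lr * grad_dec
        enc -= lr * grad_enc
end = loss(faces_a, dec_a) + loss(faces_b, dec_b)

# The "swap": encode identity A, decode with identity B's decoder.
swapped = (faces_a @ enc) @ dec_b
```

The key design choice is that only the decoders are identity-specific; because both faces pass through the same encoder, the latent code captures pose and expression that either decoder can render, which is what makes the face transfer possible.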
The video’s viral spread online Tuesday comes as numerous U.S. lawmakers sound the alarm over the potential of deepfakes to disrupt the 2020 election. A report from CNN indicates that the Department of Defense has begun commissioning researchers to find ways to detect when a video has been altered.
Late last year, Rep. Adam Schiff (D-Calif.) and other members of the House of Representatives wrote a letter to Director of National Intelligence Dan Coats to raise concerns over the possible use of the technology by foreign adversaries.
“As deep fake technology becomes more advanced and more accessible, it could pose a threat to United States public discourse and national security, with broad and concerning implications for offensive active measures campaigns targeting the United States,” the letter stated.
Researchers have already developed some methods for detecting deepfakes. One technique, which is said to have a 95 percent success rate in catching altered videos, relies on analyzing how often an individual in a video blinks.
“Healthy adult humans blink somewhere between every 2 and 10 seconds, and a single blink takes between one-tenth and four-tenths of a second,” Siwei Lyu, Associate Professor of Computer Science from the University at Albany, wrote in Fast Company last year. “That’s what would be normal to see in a video of a person talking. But it’s not what happens in many deepfake videos.”
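The blink heuristic Lyu describes can be sketched in a few lines: given the timestamps of detected blinks in a clip, flag the clip when the blink rate falls outside the normal human range of roughly one blink every 2 to 10 seconds. This is a simplified illustration, not the researchers’ actual detector — the function name and thresholds are assumptions for the example, and real systems would first need eye-landmark detection to find the blinks at all.

```python
# Hypothetical sketch of the blink-rate heuristic. Thresholds come from
# the normal range quoted above: one blink every 2-10 seconds, i.e. a
# rate between 0.1 and 0.5 blinks per second.
def looks_suspicious(blink_times, clip_seconds, min_rate=1 / 10, max_rate=1 / 2):
    """blink_times: timestamps (seconds) of detected blinks in the clip."""
    rate = len(blink_times) / clip_seconds
    return rate < min_rate or rate > max_rate

# A 60-second clip of a real speaker might show a blink every ~6 seconds,
# while many early deepfakes barely blink at all.
real_clip = looks_suspicious([6 * i for i in range(1, 11)], 60)   # False
fake_clip = looks_suspicious([], 60)                              # True
```

In practice the blink timestamps would come from a face-landmark tracker measuring eye aperture frame by frame; the heuristic itself is just this rate check.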
Deepfakes, unfortunately, will only become harder to catch as time progresses. Lyu notes that the race between those generating and those detecting fake videos will only intensify in the coming years.
While lawmakers have focused heavily on the potential national security ramifications of deepfakes, some experts remain skeptical. Thomas Rid, professor of strategic studies at Johns Hopkins University’s School of Advanced International Studies, remarked on Twitter this month that fake news and conspiracy theories already thrive based on far less than altered videos. Rid, an expert on the history of disinformation, argues, however, that deepfakes could lead some to deny legitimate information based entirely on the fact that such technology exists.
“The most concerning aspect is, *possibly*, ‘deep denials,’ the ability to dispute previously uncontested evidence, even when the denial flies in the face of forensic artifacts,” Rid wrote.
Although fears concerning deepfakes and subversion from malicious foreign actors draw attention in the nation’s capital, fake videos could potentially cause much more damage to individuals. Granted, a fake video of a politician engaged in some sort of devious behavior could spread rapidly online before being debunked. But if a similar altered video is used to blackmail a vulnerable person, it’s likely no credible fact-checkers will be there to put out the fire.
The practice of targeting ordinary women with fabricated videos has already begun. In one such example, a woman in her 40s told the Washington Post that just last year someone had used photos from her social media accounts to create and spread a fake sexual video of her online.
“I feel violated—this icky kind of violation,” the woman said. “It’s this weird feeling, like you want to tear everything off the internet. But you know you can’t.”
And those unwilling to take the time to learn how to develop their own deepfakes can simply pay to have it done for them. A now-banned community on Reddit known as “r/deepfakeservice” was found to be selling such content in early 2018 to anyone willing to provide at least two minutes of source video.
Obviously, no one thinks Steve Buscemi and Jennifer Lawrence morphed together at the Golden Globes. But videos based on more believable premises with even higher quality are coming, and the damage they do will depend on how we react. (source, source)
As I have pointed out before, the ultimate end of this technology is not the satisfaction of one’s personal lusts, but the ability of a government or corporation to manufacture videos and photos that undermine the viability of photo and video evidence itself, so as to justify social or political actions toward a specific end. It is the destruction of one of the most long-accepted forms of evidence, that of the captured image, turning it into a weapon against the very people it has historically helped to vindicate or condemn.
The above video, which I included in my article, was made LAST YEAR. If this technology was making its public debut then, and looking at the case of “Jennifer Buscemi” now, one can only imagine what other developments already exist that have not yet been seen, and what the future may hold.
As I said last year and I will say again, WATCH THIS TECHNOLOGY, because how it develops will shape the future of governmental and political relations around the world.
This is but one stage in the implementation not of the “hard tyranny” that existed in the Slavic and Eurasian world under the international socialists, known as communists, but of a “soft” tyranny cloaked in the comforts of legality, modern life, and a national and socialist consensus that is no less ruthless than its 20th-century predecessor. It is Huxleyan rather than Orwellian in its approach, and it cannot even be said to be “coming,” or to be something to warn about, for it is already here, it has been accepted, and it is naturally growing into its place in a society whose people, in time, will not give so much as a care as to how it got here.
The frog has already boiled in the pot, never so much as noticing that the heat was on.