Shoebat.com Warning Comes True, NY Times Admits That AI-Generated Faces Are A Game Changer
Back in late 2017 and early 2018, I sounded an alarm loud and clear here on Shoebat.com about a very disturbing trend I saw taking place online, one that I feared would spread widely and could have horrible effects. This trend was that of “face swapping” in videos using artificial intelligence tools. With these tools, one can take a person’s face in a video and “swap” it with another’s, so that the person appearing in the video is no longer the person who was actually there, but whoever you put in his place.
When I first reported on this trend, I noted that it began showing up in adult content on message boards. While the results were of poor quality and obviously fake to anyone who looked closely, I said that the fact that this was so popular, and that so many people were doing it, was a bad sign, because it really is not about adult content, but about the ability to manufacture fake evidence and present it as real, either to imprison the innocent or to exonerate the guilty. It is so serious that it can destroy the reliability of what has for decades been one of the most trusted forms of evidence used to prosecute crimes- that of the video tape -and thus has the potential to open the way to abuse and to the formation of a police state that Hitler, Lenin, Mao, and Stalin could only have dreamed about.
Unfortunately, my predictions have come true. One can witness the progression of this AI technology from grainy images to ones that are almost indistinguishable from reality, and sometimes cannot be distinguished at all. This is so serious that even the New York Times is now reporting on it, and on how computers are being used to create people who do not exist.
The creation of these types of fake images only became possible in recent years thanks to a new type of artificial intelligence called a generative adversarial network. In essence, you feed a computer program a bunch of photos of real people. It studies them and tries to come up with its own photos of people, while another part of the system tries to detect which of those photos are fake.
The back-and-forth makes the end product ever more indistinguishable from the real thing. The portraits in this story were created by The Times using GAN software that was made publicly available by the computer graphics company Nvidia.
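The adversarial back-and-forth the Times describes can be sketched in code. The following is a minimal, heavily simplified illustration of a generative adversarial network in Python, and is my own assumption for demonstration purposes, not the Nvidia software the Times used (which generates images, not numbers). Here the “real” data is just a bell curve of numbers, the generator is a tiny linear model, and the discriminator is a logistic classifier; the two are trained against each other exactly as the quoted passage outlines.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    # Numerically stable logistic function.
    if x >= 0:
        return 1.0 / (1.0 + math.exp(-x))
    e = math.exp(x)
    return e / (1.0 + e)

# "Real" data: samples from a Gaussian the generator must learn to imitate.
REAL_MEAN, REAL_STD = 4.0, 0.5

# Generator G(z) = w*z + b turns random noise z into a fake sample.
w, b = 1.0, 0.0
# Discriminator D(x) = sigmoid(a*x + c) scores how "real" a sample looks.
a, c = 0.0, 0.0

LR_D, LR_G, BATCH, STEPS = 0.1, 0.02, 64, 4000

for _ in range(STEPS):
    # --- Discriminator step: learn to separate real from generated ---
    ga = gc = 0.0
    for _ in range(BATCH):
        x_real = random.gauss(REAL_MEAN, REAL_STD)
        x_fake = w * random.gauss(0.0, 1.0) + b
        d_real = sigmoid(a * x_real + c)
        d_fake = sigmoid(a * x_fake + c)
        # Gradient of the loss -log D(real) - log(1 - D(fake))
        ga += -(1 - d_real) * x_real + d_fake * x_fake
        gc += -(1 - d_real) + d_fake
    a -= LR_D * ga / BATCH
    c -= LR_D * gc / BATCH

    # --- Generator step: adjust so its fakes fool the discriminator ---
    gw = gb = 0.0
    for _ in range(BATCH):
        z = random.gauss(0.0, 1.0)
        d_fake = sigmoid(a * (w * z + b) + c)
        # Gradient of the non-saturating generator loss -log D(fake)
        gw += -(1 - d_fake) * a * z
        gb += -(1 - d_fake) * a
    w -= LR_G * gw / BATCH
    b -= LR_G * gb / BATCH

fake_mean = sum(w * random.gauss(0.0, 1.0) + b for _ in range(5000)) / 5000
print(f"generator now produces samples with mean ~{fake_mean:.2f} "
      f"(real data mean is {REAL_MEAN})")
```

After training, the generator’s fakes have drifted toward the real data. Real image-generating GANs like Nvidia’s replace these two tiny formulas with deep neural networks, but the adversarial training loop, detector versus forger, is the same idea.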
Given the pace of improvement, it’s easy to imagine a not-so-distant future in which we are confronted with not just single portraits of fake people but whole collections of them — at a party with fake friends, hanging out with their fake dogs, holding their fake babies. It will become increasingly difficult to tell who is real online and who is a figment of a computer’s imagination.
“When the tech first appeared in 2014, it was bad — it looked like the Sims,” said Camille François, a disinformation researcher whose job is to analyze manipulation of social networks. “It’s a reminder of how quickly the technology can evolve. Detection will only get harder over time.”
Advances in facial fakery have been made possible in part because technology has become so much better at identifying key facial features. You can use your face to unlock your smartphone, or tell your photo software to sort through your thousands of pictures and show you only those of your child. Facial recognition programs are used by law enforcement to identify and arrest criminal suspects (and also by some activists to reveal the identities of police officers who cover their name tags in an attempt to remain anonymous). A company called Clearview AI scraped the web of billions of public photos — casually shared online by everyday users — to create an app capable of recognizing a stranger from just one photo. The technology promises superpowers: the ability to organize and process the world in a way that wasn’t possible before. (source)
To see how serious this is, just go to the website This Person Does Not Exist. When you go there, you will see a photo of a person, and you get a new one each time you reload the page.
These people do not exist. They are made by AI programs. The image above in this article was generated by that website- this woman does not exist.
Go to the site and look at them, and consider the potential of this technology. The trend is so dangerous that one cannot even trust one’s own eyes any more, because the technology is so advanced that it can deceive even the most careful of onlookers.
This is a look into the future of many trends. As our ability to rely on our physical senses is taken away, the only thing that will be left is principle, and strict adherence to it, to serve as a guide for distinguishing truth from error, not just in moral issues, but also in issues of public discourse, since it will be difficult to tell who is speaking on any particular topic and what his intentions are.