
New AI Program With Massive Potential To Reshape Society Claims To Match Porn Actors With Photos On Social Media

Many people will attempt to change their names to hide their identities, especially if they have a questionable past. People in the “adult” industry are known for doing this, especially women, because while some become “famous,” most do not, and they want to forget about this part of their lives and move on. Yet a new app may change that, as it claims to be able to match pornography actors to social media photos, and the creator of the program claims he has already identified over 100,000 such people, according to a report:

Someone posting on Chinese social network Weibo claims to have used facial recognition to cross-reference women’s photos on social media with faces pulled from videos on adult platforms like Pornhub.

In a Monday post on Weibo, the user, who says he’s based in Germany, claimed to have “successfully identified more than 100,000 young ladies” in the adult industry “on a global scale.”

To be clear, the user has posted no proof that he’s actually been able to do this, and hasn’t published any code, databases, or anything else besides an empty GitLab page to verify this is real. When Motherboard contacted the user over Weibo chat, he said they will release “database schema” and “technical details” next week, and did not comment further.

Still, his post has gone viral in both China on Weibo and in the United States on Twitter after a Stanford political science PhD candidate tweeted them with translations, which Motherboard independently verified. This has led prominent activists and academics to discuss the potential implications of the technology.

According to Weibo posts, the user and some of his programming friends used facial recognition to detect faces in porn content using photos from social platforms. His reasoning for making this program, he wrote, is “to have the right to know on both sides of the marriage.” After public outcry, he later claimed his intention was to allow women, with or without their fiancés, to check if they are on porn sites and to send a copyright takedown request.

“This is horrendous and a pitch-perfect example of how these systems, globally, enable male dominance,” Soraya Chemaly, author of Rage Becomes Her, tweeted on Tuesday about the alleged project. “Surveillance, impersonation, extortion, misinformation all happen to women first and then move to the public sphere, where, once men are affected, it starts to get attention.”

Whether the Weibo user’s claims are trustworthy or not is beside the point, now that experts in feminist studies and machine learning have decried this project as algorithmically-targeted harassment. This kind of program’s existence is both possible and frightening, and has started a conversation around whether such a program would be an ethically or legally responsible use of AI.

Just as we saw with deepfakes, which used AI to swap the faces of female celebrities onto the bodies of porn performers, the use of machine learning to control and extort women’s bodily autonomy demonstrates deep misogyny. It’s a threat that didn’t begin with deepfakes, but certainly reached the public sphere with that technology—although in the years since, women have been left behind in the mainstream narrative, which has focused on the technology’s possible use for disinformation.

Danielle Citron, a professor of law at the University of Maryland who’s studied the aftermath of deepfakes, also tweeted about this new claim on Weibo. “This is a painfully bad idea—surveillance and control of women’s bodies taken to new low,” she wrote.

What he claims to have done is theoretically possible for someone with a decent amount of machine learning and programming knowledge, given enough time and computing power, though it would be a huge effort with no guarantee of quality.

The ability to create a database of faces like this, and deploy facial recognition to target and expose women within it, has been within consumer-level technological reach for some time.

In 2017, Pornhub proudly announced new facial recognition features that it claimed would make it easier for users to find their favorite stars—and, in turn, theoretically easier for abusers or harassers to find their targets. As I wrote at the time:

Even if Pornhub deploys this technology in an ethical way, its existence should be concerning. Such technology is unlikely to stay proprietary for long, and given that some people on the internet make a habit of identifying amateur or unwitting models, the underlying tech could supercharge some of these efforts.

In 2018, online trolls started compiling databases of sex workers in order to threaten and out them. This harassment campaign had real-life consequences, with some sex workers having their payment processing or social media accounts shut down.

What this Weibo programmer is claiming to have built is a combination of these two ideas: A misogynistic, abusive attempt at controlling women. Whether it’s real or not, it’s representative of the dark paths where machine learning technology—and some of the societal toxicity around it—has taken us. (source)
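For readers wondering how feasible such cross-referencing actually is: the core operation is comparing face embeddings, something consumer-grade open-source tools already do. Below is a minimal, hypothetical sketch using the open-source face_recognition library (dlib-based); it is not the Weibo user’s code, which was never published, and the file names are placeholders.

```python
# Hypothetical sketch of face-embedding comparison with the open-source
# face_recognition library. File names below are placeholders, not real data.
import face_recognition

# Load one photo from a social media profile and one frame from a video.
social_image = face_recognition.load_image_file("social_media_photo.jpg")
video_frame = face_recognition.load_image_file("video_frame.jpg")

# Each encoding is a 128-dimensional embedding of a detected face.
social_encodings = face_recognition.face_encodings(social_image)
frame_encodings = face_recognition.face_encodings(video_frame)

if social_encodings and frame_encodings:
    # Euclidean distance between embeddings; lower means more similar.
    # The library's documented default "same person" threshold is 0.6.
    distance = face_recognition.face_distance(
        [social_encodings[0]], frame_encodings[0]
    )[0]
    print("match" if distance < 0.6 else "no match", f"distance={distance:.3f}")
else:
    print("no face detected in one of the images")
```

Scaling this one-to-one comparison to the claimed 100,000 identifications would mean scraping millions of photos and video frames and indexing the embeddings for nearest-neighbor search—exactly the “huge effort with no guarantee of quality” the article describes.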

Trends in the adult industry, or its cultural equivalents, are important to follow closely because, given the nature of the vice, it is something that cannot be hidden, and it gives close insight into the thoughts, ideas, and values of people. In the Western world, it is a vehicle of social policy that oftentimes indicates where technologies, social movements, and governments are heading. The trends supporting pedophilia and other forms of unnatural perversion, the use of artificial intelligence to manufacture false videos as evidence, the transhumanist/neodarwinist concepts of another stage of evolution, and the rise of robotics with artificial intelligence all were either first presented in the adult industry or have been able to be tracked through its movements and the people behind them.

It makes complete sense that artificial intelligence would be able to identify actresses (and actors, but the reality is that it will mostly be actresses, because men are the largest consumers of pornography) in real life from just a video or two when put against social media photos. It is a true “thot apocalypse.” The fact is that many women consume pornography, and growing numbers are involved in its production. While direct numbers are not available for the latter, it is a known fact that due to the proliferation of “adult” websites and the ease of filming, as well as of setting up a website, pornography is easy to produce and distribute. That a program could identify such women by their photos would “dox” them and expose who they are while providing no way to hide what becomes for many a sordid secret of the past. Truly, most men do not want to marry a woman who allowed herself to be a film prostitute, and if this were to happen, it would destroy many relationships and lives, and could even potentially lead some women to suicide.

However, the reason this story is so important is that it has nothing at all to do with pornography. On the contrary, the pornography is the “selling point” for a larger and more dangerous concept: the use of computers and security cameras, or really anybody with a phone and a program, to take a photo of a person and tie it to that person’s entire life history with the press of a single button.

It might sound like a concept out of the “Jason Bourne” film series, but such a use of artificial intelligence with cameras and databases truly provides “nowhere to hide” for anybody. While some may say this is a great idea because it could be used to “catch criminals running from the law” (which it certainly can), it will also be used for abuses beyond anything that the Stasi, the KGB, or any intelligence agency in history could dream of.

Consider that in India, the Hindu nationalist government has been pushing Aadhaar, the national ID system that identifies all people in the over one-billion-strong nation. It has been admitted, as Shoebat.com has exclusively reported, that all people are having their information, including religion, stored in that database. As we know that the Hindu nationalists want to create a “pure Hindu ethnostate” and in the process annihilate Christians, the database is going to be used to hunt down such people, and as the database is connected to personal bank accounts and purchases, it is not unrealistic to say that in the future none could “buy or sell” without it. Such a visual technology does not go that far, but it is one part of such a system, because nobody can “hide their face” from it, as everyone will eventually be identifiable. Consider that if a person is labeled an “outcast,” either by social condemnation or by order of law, such technology could be, and given human nature almost certainly will be, used to identify and target these people.

Take, for example, the righteous opposition to homosexuality that what is now a minority of Christians profess, despite it being an article of the Christian religion. Considering that embracing the LGBT movement has become a virtual requirement of social acceptance, and that those who refuse are treated as outcasts and increasingly discriminated against in their ability to secure and hold jobs and to interact safely in society, this technology could be a potent weapon to identify Christians and persecute them for holding to the Faith as it is taught.

This is just one issue. Imagine taking the example above with Aadhaar and adding such a facial recognition program to it. In combination with the “Hindutva” ideology that is taking over India, it would help the nationalists hunt down and murder their political, religious, or ideological enemies with ease.

Now, in fairness, this face-identifying “app” has not been shown to the public yet. It is curious that it seems to have been designed by a Chinese person living in Germany, as Germany has been at the center of much of the rise in nationalism throughout the world. However, the fact that it is being discussed means that something of its nature is in development, will come out in the near future, and will have the same effects, no matter who designs it or where it is designed. When it does come out, the question will not be whether the technology has benefits, or even who can escape its influence, but how one will deal with living in what will become a prison-like world. For wherever there are a camera and a computer, there will be the potential of all the armies of workers and spies of the intelligence services in full force, except that now, instead of having to follow people and “spy” on them, the very machines built to run society will do that job for them.

This app is not about finding out if the girl next door is making dirty movies. This is about imprisoning the human race under a web of control from which nobody will be able to escape. The porn is just the cheese meant to entice the mouse into the trap.

Science fiction has come true: Aldous Huxley’s “Brave New World” is no longer just a story, but a living hell on Earth being made reality, and as truth is stranger than fiction, one can only imagine what is coming next.
