

Your Daughter’s Face Could Be Hijacked For ‘Deepfake’ Porn:

We are barely scratching the surface of the dystopian spike in image-based sexual abuse.

Fourteen-year-old Francesca’s life has changed forever.

An email sent by her high school’s principal to her family on Oct. 20, 2023, notified them that Francesca was one of more than 30 students whose images had been digitally altered to appear as synthetic sexually explicit media — sometimes referred to as “deepfake” pornography.

Speaking to the media, Francesca shared how betrayed she felt, saying, “We need to do something about this because it’s not OK, and people are making it seem like it is.” She’s right — something must be done.

The issue of image-based sexual abuse (IBSA) — whether it takes the form of nonconsensual AI or “deepfake” content, nonconsensual recording or sharing of explicit imagery, extortion or blackmail based on images, recorded sexual abuse, or any of its many other forms — can feel like something that happens to “others.” It’s a headline we scroll past. It can feel distant from our own lives. But that’s far from the truth.

If anyone has ever taken a video or photo of you and posted it online, even an innocent family photo or professional headshot, that’s all it takes.

You and your loved ones are personally at risk of having your images turned into sexually explicit “synthetic,” “nudified,” or “deepfake” content.

It doesn’t take a tech genius on the dark web to do this: the code and tools are freely available on popular open-source platforms like Microsoft’s GitHub and are shared widely online. In fact, GitHub hosts the source code for the software used to create 95 percent of sexual deepfakes, despite having been notified of the exploitative code by anti-exploitation advocates.

These kinds of images can be created in less time than it takes to brew a cup of coffee.

Even if people don’t know how to create these images themselves, they can openly solicit and pay others to do so on websites like Reddit, where entire communities are dedicated to creating and trading nonconsensual explicit material.

And here’s the kicker: these images aren’t some sloppy Photoshop job of a face pasted onto a body, à la 1990. Top executives at one of the most innovative technology companies in the world have told us that even they typically cannot tell whether an image is synthetic, artificial pornography. There is no easy watermark separating fake from real. —>READ MORE HERE

The unstoppable rise of deepfake PORN: Experts reveal explosion in AI sites and apps that stitch faces of ANYONE with photos online onto naked bodies, as victims tell of their horror after police say there is NOTHING they can do:

It was the winter of 2020 when an acquaintance arrived unannounced at Helen Mort’s door, telling the mom-of-one he had made a grim discovery.

This man – who Helen has chosen not to identify – had found dozens of graphic images of her plastered on a porn site. Some depicted obscene and violent sex acts. They had been online for over a year.

‘At first it didn’t really compute,’ Helen, now 38, tells DailyMail.com. ‘How could I be on this website? I’d never taken a nude image of myself.’

Then it became clear: She was the victim of a deepfake porn attack.

Her keyboard perpetrator had pilfered images from her social media accounts and used artificial intelligence (AI) and other sophisticated computer software to transpose her face onto the bodies of porn actresses.

Some pictures were so realistic the untrained eye wouldn’t be able to pick them out as fake.

In one, Helen’s face is seen smiling – the original photo had been taken on vacation in Ibiza – but her face is now stitched onto the body of a naked woman down on all fours, being strangled by a man.

‘I felt violated and ashamed,’ she says. ‘I was shouting, “why would somebody do that? What have I done to deserve it?”‘

Next to the vile cache was a message: ‘This is my blonde girlfriend Helen, I want to see her humiliated, broken and abused.’

However, she says she knew instinctively that her boyfriend, the father of her toddler son, was not to blame.

What she couldn’t work out was why she had been targeted.

‘I thought this was something that only happened to celebrities,’ she says. ‘I’m a nobody. Clearly somebody had a vendetta against me.’

Helen works as a poet and part-time university lecturer in creative writing, and lives in Sheffield, in the UK. —>READ MORE HERE
