
Stop It With The Selfies. Really.

With enough images of your face, techies can make hardcore porn starring ... you

A friend of mine and reader of this blog once wrote to tell me about how he and the men in his congregation had to go clean out the apartment of the kindly elderly bachelor who worshiped with them … and found lots of child pornography. They also found thousands of ordinary images of children this creep grabbed off the Internet, from people posting photos of their kids. It shook him up so much that he resolved never, ever to post photos of his kids to social media.

I read a news story last week in which researchers in this field said that parents who post lots of pictures of their kids to social media wouldn’t do it if they knew how those images were used by perverts. I just tried to find that story again, but the Google search for it is too depressing, so I gave up.

Just about everybody I know posts pictures of their kids to social media, and I rarely tell them about this, because I don’t want to freak them out. But this story finally pushed me over the line:

There’s a video of Gal Gadot having sex with her stepbrother on the internet. But it’s not really Gadot’s body, and it’s barely her own face. It’s an approximation, face-swapped to look like she’s performing in an existing incest-themed porn video.

The video was created with a machine learning algorithm, using easily accessible materials and open-source code that anyone with a working knowledge of deep learning algorithms could put together.

That particular video appears to be the work of only one person, a Redditor who goes by the name “deepfakes.” More:

According to deepfakes—who declined to give his identity to me to avoid public scrutiny—the software is based on multiple open-source libraries, like Keras with TensorFlow backend. To compile the celebrities’ faces, deepfakes said he used Google image search, stock photos, and YouTube videos. Deep learning consists of networks of interconnected nodes that autonomously run computations on input data. In this case, he trained the algorithm on porn videos and Gal Gadot’s face. After enough of this “training,” the nodes arrange themselves to complete a particular task, like convincingly manipulating video on the fly.

Artificial intelligence researcher Alex Champandard told me in an email that a decent, consumer-grade graphics card could process this effect in hours, but a CPU would work just as well, only more slowly, over days.

“This is no longer rocket science,” Champandard said.

The ease with which someone could do this is frightening. Aside from the technical challenge, all someone would need is enough images of your face, and many of us are already creating sprawling databases of our own faces: People around the world uploaded 24 billion selfies to Google Photos in 2015-2016. It isn’t difficult to imagine an amateur programmer running their own algorithm to create a sex tape of someone they want to harass.

More:

Deepfakes told me he’s not a professional researcher, just a programmer with an interest in machine learning.

“I just found a clever way to do face-swap,” he said, referring to his algorithm. “With hundreds of face images, I can easily generate millions of distorted images to train the network,” he said. “After that if I feed the network someone else’s face, the network will think it’s just another distorted image and try to make it look like the training face.”

Read the whole thing.

He needs only “hundreds” of face images of a person. How many photos of yourself have you posted on social media? How many photos of your kids? It’s horrifying enough to imagine your own face being used for one of these fake porn videos. Now imagine some scumbag like the guy in my friend’s congregation using the faces of your children.
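For readers wondering what “training the network” on distorted face images actually means in practice, here is a minimal sketch of the general idea, using Keras with a TensorFlow backend (the libraries the quoted article mentions): a small convolutional autoencoder that learns to reconstruct one person’s face from warped or degraded copies of it. To be clear, this is not the Redditor’s actual code; the image size, layer sizes, and the hypothetical load_face_crops() and distort() helpers are illustrative assumptions only.

```python
# A minimal sketch of the approach the quote describes: an autoencoder
# trained to rebuild one person's face from distorted copies of it.
# Sizes, names, and helpers here are assumptions, not deepfakes' code.
from tensorflow import keras
from tensorflow.keras import layers

IMG = 64  # assumed face-crop resolution


def build_autoencoder():
    inp = keras.Input(shape=(IMG, IMG, 3))
    # Encoder: compress the face crop into a small latent representation
    x = layers.Conv2D(32, 5, strides=2, padding="same", activation="relu")(inp)
    x = layers.Conv2D(64, 5, strides=2, padding="same", activation="relu")(x)
    x = layers.Flatten()(x)
    latent = layers.Dense(256, activation="relu")(x)
    # Decoder: reconstruct the *training* face from the latent code
    x = layers.Dense(16 * 16 * 64, activation="relu")(latent)
    x = layers.Reshape((16, 16, 64))(x)
    x = layers.Conv2DTranspose(32, 5, strides=2, padding="same", activation="relu")(x)
    out = layers.Conv2DTranspose(3, 5, strides=2, padding="same", activation="sigmoid")(x)
    model = keras.Model(inp, out)
    model.compile(optimizer="adam", loss="mae")
    return model


if __name__ == "__main__":
    model = build_autoencoder()
    model.summary()
    # Real training would look roughly like:
    #   faces = load_face_crops(...)           # hypothetical: (N, 64, 64, 3) crops of one person, scaled to [0, 1]
    #   model.fit(distort(faces), faces, ...)  # hypothetical distort(): random warps, shifts, blur
```

The point of the sketch is how little of it is specific to any one person: feed it a different folder of face crops and the same few dozen lines retrain on a new target.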

UPDATE: Wow, read Alexi Sargeant’s prescient and insightful essay about what CGI and the Frankensteining of the Grand Moff Tarkin may mean for art and democracy. Excerpt:

This technology could potentially turn every public figure who’s been in enough available footage into a puppet like Rogue One’s Grand Moff Tarkin. What will happen when anyone can create a convincing video of the president or any other global leader saying anything and share it online? Is it responsible to perfect and popularize this technology?

Fake news purveyors are not the only ones being done a favor by this new development. It’s not hard to imagine pornographers embracing the endless possibilities of CGI recreation. After all, practically every new technology in the world of communication and entertainment is eventually put to pornographic use, and the desires of porn users surely extend to deceased actors and celebrities. It’s the next logical step after crassly commercial uses of the technology, like bringing back Audrey Hepburn to sell chocolate bars in commercials.

Now, the voraciousness of the porn and advertising industries should not frighten us so much that we never develop any new technology. But it should give us caution as we approach ethical edge cases. Not every end-user of a dubious technology like digital resurrection will operate with the respect and propriety of brand-conscious Disney imagineers.
