Why Deepfake Technology Must Be Banned

On Friday, Samsung revealed that they are now able to create deepfake videos from a single photo. With that came another good reason why the technology must be banned.

Imagine seeing a politician in a viral video saying something absolutely outrageous. In today’s political landscape, that shouldn’t be too hard. But in this case, it’s totally out of character for this particular opposition politician. What they are saying is incriminating, morally reprehensible, and obvious political suicide. What the hell!? You can’t vote for this person now!

The problem with the viral video is that it isn’t real. Even though it looks authentic, it’s a fake, created by the current political leader’s campaign office and doctored to quickly spread false rumors about the opposition. And it works. The opposition politician falls like a rock in the polls. Even if the video is later debunked as fake, the damage is irreversible. The video continues to spread like wildfire across the internet. It’s not only popular on niche political sites, but on mainstream social media sites as well. And even though they know the video is fake, the mainstream sites refuse to remove it.

If you think this is the stuff of a dystopian alternate future, I’m sorry to report that this is the present.

Deepfake

The video described above is fictional, but the technology used to create it is very real. It’s called deepfake, a technique for human image synthesis based on artificial intelligence. Like other AI technology, deepfake is developing very fast.

In 2016, a team of researchers demonstrated how to manipulate a video of someone’s face in real time. A commercial product that could recreate any voice from just one minute of sample audio was made available in 2017. In 2018, these two technologies were brought together to make a fake PSA featuring Barack Obama. And now, in 2019, Samsung has the technology to create deepfake videos from just a single photo.

At first, I dismissed Samsung’s news as yet more AI hype. Then I realized how fucked we are if deepfake technology is allowed to evolve and thrive. Take a look at this video of Barack Obama.

All done? Tell me that you saw it was fake before it was revealed, and I’ll call you a blatant liar. BuzzFeed created the video in 2018, and that was the state of deepfake back then. Today, the technology is even better, and it’s even easier to use.

We all realize that this is a weapon, right?

The Fall of Common Sense

If you believe that mainstream social media sites won’t take part in knowingly spreading fake videos, I’m afraid I have to shatter your naive world view. A video of Nancy Pelosi, apparently showing her drunk or unwell, recently appeared online. The Pelosi video was created not with deepfake technology, but with traditional editing techniques. Not long after its release, the video was debunked as fake. But Facebook - also known as the cancer of the internet - refuses to remove it.

What happens when you can no longer trust what you’re seeing and hearing? You get paranoid, and paranoid people get itchy trigger fingers. To be honest, I’m more concerned about deepfake technology than weaponized AI. The latter can be regulated and limited to some degree; the former, not so much.

I’m not usually in favor of regulating or banning technology, but in the case of artificial intelligence, and deepfake technology in particular, I’m all for making exceptions. AI technology and research should be strictly regulated globally, in the same way the weapons industry is. And deepfake technology should be banned.

If that doesn’t happen, 2020 and beyond will be a fucking nightmare.
