
Fake news is scary but deepfakes could be even worse

Trevor feels future face-swapping technology could challenge our notions of what is real

Why did it have to be like this? I just wanted to write about pets this week.

Recently someone forwarded me a Mark Critch video in which he mocks Donald Trump on the topic of Alberta separation.

Like most others who reacted online, my initial reaction was that Critch’s makeup and mannerisms were an eerily perfect imitation of Trump, almost to the point of implausibility.

It was, of course.

Because while Critch was imitating Trump, he was also using a face-swapping app built on what is known colloquially as deepfake technology.


For those of you who haven’t heard of it, imagine an app that lets you superimpose your facial expressions and mannerisms over another person’s physical face.

This is done by taking a photo of a person you want to “fake” and superimposing it over another photo or video of what you want them to be saying or doing.

Using a number of complex processes and artificial neural networks (ANNs) — which I’m not even going to attempt to explain, because I can’t begin to wrap my head around how they work — the pixels near the seams between the two photos or videos are reprocessed to make the discrepancies less noticeable.
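For the curious, here is a heavily simplified sketch of just that blending step, written in Python with the freely available OpenCV library. The file names are placeholders, and real deepfake tools rely on trained neural networks rather than this single cloning call; this only illustrates why the seams are so hard to spot.

import cv2
import numpy as np

# Placeholder inputs (hypothetical file names): a cropped source face
# and the target frame you want to paste it into.
source_face = cv2.imread("source_face.jpg")
target_frame = cv2.imread("target_frame.jpg")

# A rough mask marking which pixels of the source face to copy.
mask = 255 * np.ones(source_face.shape[:2], dtype=np.uint8)

# The point on the target frame where the face should be centred.
center = (target_frame.shape[1] // 2, target_frame.shape[0] // 2)

# seamlessClone reprocesses the pixels near the seams so the pasted region
# blends into its surroundings instead of leaving an obvious edge.
output = cv2.seamlessClone(source_face, target_frame, mask, center, cv2.NORMAL_CLONE)
cv2.imwrite("blended.jpg", output)

A real deepfake pipeline would generate the source face itself frame by frame with a neural network; the blending is just the finishing touch.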

The result is everything from Trump videos like Critch’s (which would be indistinguishable from the real thing if not for Critch’s voice and the fact that it is advertised as a parody) to videos of movie stars altered to say, well, anything someone using the technology can think of.

I like comedy. I would even say I love it, but this goes too far. While these videos seem innocuous now (I have yet to see one that isn’t obviously faked), they have me terrified for a future in which people truly can’t distinguish reality from fantasy.

That’s the way it’s always been with technology: video games, the internet and television are all perfect examples. Once we have a technology, we constantly improve on it, often exponentially, to the point where its current iteration bears little resemblance to its origins.

So while deepfakes may not threaten us now, or even over the next decade (hopefully), they will most certainly become a viable threat, especially when combined with computer-generated imagery (CGI).

When used together, these two technologies could let someone simulate a phony global catastrophe, complete with reactions from leaders around the world. All pre-recorded, of course, to make the events seem as though they were happening in real time.

Just imagine: hackers find a way to broadcast a forged State of the Union address in which a future president tells the U.S. that an extinction-level asteroid is heading towards Earth, all with the intent of causing mass global chaos, or even just plunging the United States into anarchy.

Even more realistic: the video is leaked online and characterized as behind-the-scenes footage of the president addressing their inner circle in private, perhaps from some subterranean bunker for a dash of dramatic flair.

When you take into account how sensationalized and reactionary media has become, it would only take a few major outlets getting it wrong and running the faked footage to panic the global population.

A relatively smaller issue (though one that will undoubtedly become a reality sooner) is affecting the outcome of political races with footage of candidates in — ahem — compromising situations.

Perhaps scariest is the fact I can see us falling for it.


I would not have realized the video was using deepfake technology (or even known what it was, beyond my introduction to ANNs via those ridiculous Snapchat filters) if I hadn’t done the research myself.

The vast majority of people commenting on Critch’s video wrote things along the lines of “this is the best Trump impression I’ve ever seen; it’s uncanny.”

I wonder how many of them actually took the time to research the video and realize what it was.

For all the talk of fake news and disinformation, we may only be on the cusp of a new age of propaganda, one that challenges our notions of what is real.

I wish I could say I’m confident we’ll be able to navigate the technology safely, but I’m not.

Perhaps it’s a logical fate for a culture more interested in the 10-year photo challenge than in the protests going on in the name of democracy and human rights in places such as Hong Kong, Iran and Bolivia.

At least that footage is still real.



trevor.hewitt@interior-news.com
