AI Can Manipulate Video... Should We Be Scared?

The screen you’re gazing into is a complex amalgamation of information: compressed, translated, and projected out toward your eyes. Despite the unfathomable amount of information smashed together to create the picture, we tend to accept what we see on a screen as truth. We obey the authority of the television set and the news anchor, and we even suspend our disbelief in the movie theatre when CGI takes characters into fantastical situations.

But the era of trusting our eyes is quickly coming to a close, and we, as a society, find ourselves lacking the critical-thinking skills to separate reality from artifice.

This is what you want to see

We are moving toward an era in which artificial intelligence reads the preferences of the human mind. A computer that knows what we like and shows us what we want to see sounds like the culmination of visualization technology, and we’re not that far from that future. Just this year, IBM’s Watson AI was tasked with creating highlight reels from the Masters golf tournament based on individual user tastes. Imagine feeding hundreds of hours of footage into your computer and having it spit out exactly the parts you wanted to see. Pretty neat.

View at your own risk

However, the same technology that enables AI to sort through video footage for our benefit can also be used to deceive us. The most effective example of AI-assisted deception so far has been made possible by an open-source program called FakeApp. The software allows users to train an AI model to recognize and replace specific faces in a video. Early applications have focused mostly on inserting Nicolas Cage into different movies and placing celebrity faces over those of adult video stars.
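
For the technically curious, the core trick behind FakeApp-style face swapping is usually described as a shared encoder paired with one decoder per face. Here is a minimal, illustrative sketch of that idea in PyTorch; the layer sizes, names, and placeholder data are assumptions for demonstration, not FakeApp’s actual code.

```python
# Minimal sketch of the shared-encoder, two-decoder face-swap idea.
# Everything here (sizes, names, random data) is illustrative only.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        # Compress a 64x64 RGB face crop into a compact latent vector.
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 64 * 3, 512),
            nn.ReLU(),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        # Reconstruct a face crop from the shared latent vector.
        self.net = nn.Sequential(
            nn.Linear(512, 64 * 64 * 3),
            nn.Sigmoid(),
        )

    def forward(self, z):
        return self.net(z).view(-1, 3, 64, 64)

# One encoder learns face-agnostic structure (pose, lighting, expression);
# each person gets a decoder trained to redraw *their* face from it.
encoder = Encoder()
decoder_a = Decoder()  # would be trained on person A's face crops
decoder_b = Decoder()  # would be trained on person B's face crops

# The swap: encode a frame of person A, decode it with person B's decoder.
frame_a = torch.rand(1, 3, 64, 64)  # stand-in for a real face crop
swapped = decoder_b(encoder(frame_a))
print(swapped.shape)  # torch.Size([1, 3, 64, 64])
```

The reason this works at all is that the encoder never learns whose face it is looking at, only how a face is posed and lit, so swapping decoders swaps identities.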

The results range from uncomfortably accurate to awful, and rely heavily on the user possessing a large volume of video footage of the people whose faces they’d like to swap. The final products, even good ones, are also generally easy to spot. Faces flicker and morph. Lighting seems off. Heads are placed on disproportionate bodies. Voices remain unchanged, meaning putting Dwayne Johnson’s face over John Waters’ would lead to an uncharacteristically sarcastic version of The Rock with a tiny body, perhaps with a thin, flickering moustache. But give this technology enough time, and it might not be so easy for us to tell.
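
As a toy illustration of the kind of artifact you can look for, here is a naive Python sketch (OpenCV assumed) that flags unusually large frame-to-frame jumps, the sort of flicker described above. Real deepfake forensics is far more sophisticated; treat this as a demonstration of the idea, not a detector. The filename is a placeholder.

```python
# Naive flicker check: flag frames that differ from their predecessor
# far more than the clip's average. A toy heuristic, not real forensics.
import cv2
import numpy as np

cap = cv2.VideoCapture("clip.mp4")  # placeholder path
prev, jumps = None, []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
    if prev is not None:
        jumps.append(np.abs(gray - prev).mean())  # mean absolute change
    prev = gray
cap.release()

if jumps:
    mean, std = np.mean(jumps), np.std(jumps)
    # Frames that change far more than usual hint at flicker or splices.
    suspects = [i + 1 for i, j in enumerate(jumps) if j > mean + 3 * std]
    print(f"Frames with suspicious jumps: {suspects}")
```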

Audible deception

In fact, the audio half of the problem has already been solved by other software. Adobe revealed its VoCo software in 2016, which lets users create imitative vocal samples from short audio clips. VoCo reads the waveform of an input voice and generates new audio that sounds like the same person. At the time, VoCo needed around 30 minutes of input audio to produce a convincing clip; today, similar software needs under five seconds.
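
To see why a few seconds is plausibly enough, consider that even crude spectral statistics act as a rough fingerprint of a voice. The sketch below is not voice synthesis; it is a toy comparison of two clips using MFCC features (librosa assumed, filenames are placeholders), just to show how little audio it takes to characterize a speaker.

```python
# Toy voice "fingerprint": summarize a clip's spectral envelope with
# MFCCs and compare two clips by cosine similarity. Not synthesis,
# only an illustration that seconds of audio characterize a speaker.
import librosa
import numpy as np

def voice_fingerprint(path, seconds=5.0):
    y, sr = librosa.load(path, duration=seconds)  # first few seconds only
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=20)
    return mfcc.mean(axis=1)  # one 20-dim vector per clip

def similarity(a, b):
    # Cosine similarity: values near 1.0 mean the clips sound alike.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

known = voice_fingerprint("real_voice.wav")      # placeholder file
suspect = voice_fingerprint("suspect_clip.wav")  # placeholder file
print(f"Voice similarity: {similarity(known, suspect):.2f}")
```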

So we’re quickly coming to a point where the things we watch or listen to in the digital world may not only be unreal, but may also be built with the intent to deceive. Right now, these technologies are available to just about anyone, which means high-quality “deepfakes” may become widespread sooner than we think. Seeing as we’re still riding the vicious wave of text-based misinformation on newsfeeds, if we don’t think ahead, we may find ourselves unable to distinguish the real thing from a concocted one.

We can still beat the machine

The good news is that, just as with a deceptive article or a Photoshop-enabled hoax, the human brain is pretty good at teasing fact from fiction. No matter how convincing or compelling a video may be, we have the power to think. Logic and common sense usually give us the tools we need to validate a situation with some degree of certainty. Here are some simple questions to help get you started:

  1. Is the video content unnecessarily short or choppy?

  2. Does the content make especially outrageous claims?

  3. Are there other sources running the same story?

  4. Does a Google search turn up results suggesting the content might be fake?

  5. Would you believe the story if your least-trustworthy friend told it to you?

Flex your corroboration muscle often and stay skeptical. There has never been a shortage of people looking to take advantage of human naïveté; only the vehicles have changed. Don’t let yourself fall victim to the latest in fakery and deception.