Written by Mark Evilsizor
From his column Tech
All our lives we’ve heard the sayings “Seeing is Believing” and “What You See is What You Get,” but such truisms may be tossed to the wind with the advent of new technology.
Live video footage is persuasive and has the power to communicate in a way that motivates us to take notice and action. From the civil rights march across Selma’s Edmund Pettus Bridge, to the dramatic coverage of the war in Vietnam, video images have changed our world.
Now, however, we have entered an era in which the confluence of several technologies allows for the creation of video that does not reflect reality. Known as deepfake (DF) video, this technology should cause us concern.
In 2021, a figure who appeared to be Bruce Willis starred in an ad for a Russian cell phone company. The action-movie-style commercial appeared to show Bruce interacting with another character while tied to a ticking bomb. Had he not been speaking Russian, you would have thought it was him. But it was not. The “Bruce” of the commercial was a hybrid: video images from “Die Hard” and “The Fifth Element” were expertly mapped onto the head of another actor by skilled production personnel using artificial intelligence (AI) algorithms.
Also in 2021, videographer Chris Umé used DF techniques to make a series of videos appearing to show Tom Cruise doing a magic trick and eating candy. They are so convincing that most of us would be fooled into thinking we were watching the actor himself.
While these examples took weeks of effort to create the highest quality possible, basic tools to make DF videos are becoming more available and easier to use, with increasingly convincing results.
One risk of the proliferation of this technology is that it gives the same bad actors who currently send phishing emails to folks like you, me, and our employers another method of attack. The FBI has issued a warning to cybersecurity professionals about such attacks, referred to as Business Identity Compromise (BIC), saying, “This emerging attack vector will likely have very significant financial and reputational impacts to victim businesses and organizations.”
To defend against this, we would do well to review the fundamentals of computer security at churches and our places of business:
- Educate staff about phishing in various media;
- Ensure multi-factor authentication (MFA) is enabled for access to vital systems; and
- Ensure the approval of multiple persons is required in order to transfer organizational money or pay large invoices, and that this approval occurs over established communication channels.
Another risk of DF videos is that, combined with social media, they can be used to spread false narratives of events quickly. Imagine what this could mean for national or world political situations. If a video in our social media feed looks like it confirms our worst fears, we merely need to click twice and our hundred closest friends can view it as well, its narrative strengthened by our endorsement.
These days, we do well to take a step back and pause before accepting what we see online as reality. The FBI memo provides great advice for responding to this type of content: “Be alert when consuming information online, particularly when topics are especially divisive or inflammatory; seek multiple, independent sources of information.”
While the technologies and methods of deception have changed over the millennia, the writer of Ecclesiastes reminds us, there is nothing new under the sun. In 1710, Irish satirist Jonathan Swift wrote about the spread of falsehoods, “and it often happens, that if a Lie be believ’d only for an Hour, it has done its Work, and there is no farther occasion for it. Falsehood flies, and the Truth comes limping after it; so that when Men come to be undeceiv’d, it is too late; the Jest is over, and the Tale has had its Effect...”
Mark Evilsizor has worked in Information Technology for more than 25 years. He currently serves as head of IT for the Linda Hall Library in Kansas City, Mo. Opinions expressed are his own.