Last week, we published a post on biometric security, which uses the features of a person's body, as well as their mannerisms and behavior, as credentials in a computerized security system. Think of the facial scans people use to unlock their phones. We touted biometrics as tools any firm should consider using in its security structure, since they're harder to hack than passwords.
But before we move on to the rest of our post, I'm betting that some of you are golf fans, right (bear with me)? Did you know that Tom Cruise is an avid golfer? Well, check out this video of Cruise showing off his golf swing and talking about his love of the sport. It's so relatable... except that's not Tom Cruise. Let me explain...
Deepfakes and Biometrics
If you didn't watch the video, it showed a man who looked and sounded like Tom Cruise performing actions in front of a camera. It wasn't actually Tom Cruise, though. It was a computer-generated likeness, crafted by an AI that had studied countless images and sound clips of the real Tom Cruise and used them to construct a lifelike copy. From there, the AI superimposed the likeness onto an actor's face.
Computer-generated likenesses like these are examples of deepfakes. Deepfake technology relies on predictive AI, which examines a large data set of a person's appearance, voice, and gestures. The AI then uses complex mathematical modeling to generate new images based on the real-world data it has studied. With enough data, the AI's predictions are often so accurate that they're indistinguishable from the real thing.
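For readers who want to peek under the hood, here's a heavily simplified sketch of that "study, then reproduce" idea, written in Python with PyTorch. The model sizes and training data below are placeholders, and real deepfake pipelines are far more sophisticated; this is just meant to show the shape of the process, not how any particular tool works.

```python
# A toy illustration of the idea behind deepfake generation: an autoencoder
# "studies" many images of a face and learns to reproduce it. Real deepfake
# pipelines are far more elaborate; model sizes, image dimensions, and the
# training data here are placeholders.
import torch
import torch.nn as nn

class FaceAutoencoder(nn.Module):
    def __init__(self, image_pixels=64 * 64 * 3, latent_dim=128):
        super().__init__()
        # Encoder: compress a face image into a small "identity" vector
        self.encoder = nn.Sequential(
            nn.Flatten(),
            nn.Linear(image_pixels, 512), nn.ReLU(),
            nn.Linear(512, latent_dim),
        )
        # Decoder: reconstruct a face image from that vector
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 512), nn.ReLU(),
            nn.Linear(512, image_pixels), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x)).view(x.shape)

# Stand-in for the thousands of real photos a deepfake would be trained on
fake_dataset = torch.rand(32, 3, 64, 64)

model = FaceAutoencoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Training loop: the model gradually gets better at reproducing the target's
# face, which is what later lets it generate convincing new frames.
for step in range(100):
    reconstruction = model(fake_dataset)
    loss = loss_fn(reconstruction, fake_dataset)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```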
Deepfakes can be harmless fun. Naturally, celebrities like Cruise are easy pickings for deepfake AI because they've spent so much time in the public eye, granting the AI a deep pool of data to draw upon.
But deepfakes can also be used to circumvent biometric security. A recent article in The Verge reports on security checks performed by the cybersecurity firm Sensity, which works with banks to provide security. A check with ten of Sensity's clients found that their security procedures were vulnerable to deepfakes nine out of ten times.
Banks often use something called a liveness test, a form of biometric security in which a person presents their face and then performs a gesture, like turning their head or changing their facial expression, which signals to the security system that the person being scanned is actually there. Ideally, this prevents criminals from using photos and other work-arounds to slip past security. But a criminal with the right know-how can use a deepfake to get past a biometric test of this kind.
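To make the idea concrete, here's a bare-bones sketch in Python of how a challenge-response liveness check might flow. The capture and gesture-detection functions are stand-in stubs (a real system would use camera hardware and a computer vision model), so treat this as an outline of the concept rather than a working security check.

```python
# Simplified sketch of a challenge-response liveness test. The stubs below
# are hypothetical placeholders; the point is the flow, not the implementation.
import random

CHALLENGES = ["turn your head left", "turn your head right", "smile", "blink twice"]

def capture_video(seconds: int) -> list[str]:
    """Stub: a real system would record frames from the user's camera."""
    return ["frame"] * (seconds * 30)

def detect_gesture(frames: list[str]) -> str:
    """Stub: a real system would analyze the frames with a vision model."""
    return random.choice(CHALLENGES)  # placeholder result

def liveness_check() -> bool:
    # 1. Ask for a gesture the user can't predict ahead of time.
    challenge = random.choice(CHALLENGES)
    print(f"Please {challenge} within the next 5 seconds.")
    # 2. Record the user's response.
    frames = capture_video(seconds=5)
    # 3. Verify that the requested gesture actually happened.
    return detect_gesture(frames) == challenge

if __name__ == "__main__":
    print("Liveness confirmed" if liveness_check() else "Liveness check failed")
```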
Protecting Against Deepfake Attacks
If you use biometric security measures, there are some best practices for preventing deepfake attacks. Deepfake technology is still in its early stages, so there are clues you can use to determine whether an image or video is a deepfake. Usually referred to as artifacts, small visual glitches like unnatural eye movements, image warping, and poorly defined facial features can all signal to an observer that they're dealing with a fake.
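As one illustration of how an artifact can be turned into an automated check, here's a toy Python heuristic built on the observation that people blink at a fairly predictable rate, while some deepfakes blink unnaturally. The blink timestamps and thresholds below are hypothetical, and a real detector would combine many signals like this rather than relying on any single one.

```python
# Toy artifact heuristic: flag a clip if its blink rate falls outside a
# plausible human range. Timestamps and thresholds are hypothetical examples.

def blink_rate_looks_human(blink_timestamps: list[float], video_seconds: float,
                           min_per_minute: float = 8.0,
                           max_per_minute: float = 30.0) -> bool:
    """Return True if the blink rate falls in a plausible human range."""
    blinks_per_minute = len(blink_timestamps) / (video_seconds / 60.0)
    return min_per_minute <= blinks_per_minute <= max_per_minute

# Hypothetical data: only two blinks detected over a 60-second clip
suspect_clip_blinks = [12.4, 43.1]
if not blink_rate_looks_human(suspect_clip_blinks, video_seconds=60.0):
    print("Warning: unnatural blink rate; possible deepfake artifact.")
```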
Although there are programs on the market designed to detect deepfakes, a truly thorough security plan needs to combine capable detection software with manual, human review. Give Titan Tech a call today to learn more.
And join us later this week for more tech news.
**Note: if you're interested in learning more about deepfake technology and the security threat it poses, we recommend reading this write-up from the Department of Homeland Security.**