Are you concerned about your firm's cybersecurity?

Are you worried about your team members' password use?


With Deepfake Visual Tech, Cybercriminals Can Now Very Convincingly Impersonate People You Trust

There’s an emerging threat to your data security that you’ll likely have to face before long.

It’s called a deepfake attack. 

Sure, it makes for a catchy headline when this technology is used to create an artificial intelligence-generated doppelganger meant to look and sound like Tom Cruise.

However, it doesn’t take a genius to imagine other ways this technology can be used.

A deepfake attack is one in which a cybercriminal contacts you via teleconference and impersonates someone you know and trust. More often than not, the person being impersonated is your boss—or, if you're the boss, an outsider whose requests you'd be inclined to go along with.

It sounds crazy, but it’s true. Thanks to artificial intelligence and machine deep-learning techniques, it’s now possible for a cybercriminal to invite you to a Zoom meeting and pose as—oh, let’s say—a judge before whose court you currently have business.

No way, you say. Yet when you log into Zoom, there on your screen, big as life, is that very judge—"in the flesh."

You know it’s the judge because you’ve seen her face a million times. And up close, too, having stood before her at the bench for sidebars and in her chambers for pre-trial conferences. You could instantly pick her out from a crowd. Ditto the sound of her voice.

But it’s not really the judge. It’s actually some bald guy with a massive scar across his face who goes by the name of Blofeld and lives inside a dormant volcano. He only looks—and sounds—precisely like the judge due to the miracle of deepfake visual technology.

Hard to Say No

Okay, now you’re meeting with a cybercriminal you think is this judge you know.

After exchanging pleasantries, the "judge" tells you she has some terrible news. She relays that she received a call from the FBI informing her that several lawyers who appeared before her in the last couple of years recently had their computer systems breached by hackers—and you are one of the victims.

But not to worry, she says, for she has made arrangements with a cybersecurity company to remedy the situation. Further, she promises you that the court is paying for this service—no charge to you.

You’re told the only thing you have to do is produce for this “judge” your system login credentials so she can forward them to the cybersecurity company.

Here are two honest questions for you. First, if you were unaware of deepfake technology, would you have any reason to doubt that you're talking to the actual judge and not to an imposter? Probably not.

Second, given it’s a judge (or so you think): are you going to say no? Again, probably not.

You want to stay on her good side, so you’ll be inclined to pass along those login credentials as requested. You’re also likely to do so because you know the “judge” is trustworthy and acts in your best interests with noble intent.

Now, the truth of the matter is that you weren’t hacked. There was no call from the FBI. There is no cybersecurity company. The whole thing is a made-up story. But you don’t know that you’re being conned. So, you volunteer your login credentials.

And the next thing you know, you’re hacked—for real.

What You Can Do

Fortunately, deepfake attacks are not yet as widespread a threat as other forms of cybercrime. For now, at least, they’re mainly used by pranksters and vengeance-seekers.

But rest assured, it’s only a matter of time before deepfake attackers come for you—or, to be more precise, come for your client data.

How do you defend against this kind of cyberattack? Starting right now, you need to ratchet up your awareness as you interact with people on video calls—and this also applies to people you talk to on voice-only calls.

Understand that floating around in the ether is a vast amount of publicly available information about you, judges, clients, witnesses, colleagues at other law firms, the people at the companies you work with, and practically everybody else. Owing to this availability, it's relatively easy for cybercrooks who want to use deepfake technology to Google biographical details, video clips, and audio samples of whoever they want to impersonate.

Therefore, when you participate in a teleconference, keep a sharp eye out for quirky stuff happening with the video and/or the audio. Deepfake technology is pretty sophisticated, but if you look carefully, you can usually detect when it’s employed. For example, if you see the head of the person on your screen making weird movements, you should immediately suspect that deepfake technology is in play.

Another clue is if the lips don’t exactly sync with the words being spoken—you know, how it looks when you’re watching a foreign-language movie with the actors’ voices dubbed in English.

Also, look for an effect known as video warping, in which the image you're seeing becomes distorted. It happens because the system generating the fake doesn't have enough processing power to render a natural-looking image in real time. For example, you might notice that the onscreen person's naturally round face suddenly appears slightly rectangular.

Ultimately, you’ll need to adopt new and better rules for authenticating the face on the other side of that display screen. And you should avoid taking any actions—like handing over your login credentials, authorizing payments, or agreeing to download offered documents—until you’re 100 percent certain you’re dealing with who you think you’re dealing with.

If you’d like help equipping yourself and your team to spot deepfake attacks and implement the security measures it’ll take to thwart those cyber assaults, please call or drop me an email. I’m here to help.

