Deepfake Awareness Training for the Legal Industry

Christopher Walken, or is it?

I am constantly amazed at the energy and creativity cybercriminals put into faking us out. At the core of their efforts: they want to trick us into doing something we probably shouldn’t do, or make us believe something that isn’t true. Increasingly, they are targeting corporate legal departments and law firms with deepfake tactics that trick employees into transferring money, handing over passwords, or taking other harmful actions. As their tactics get more sophisticated, so should your security awareness training. If you want to protect your law firm from the fallout of criminal deepfake activity, you have to educate your people to recognize it when they see it. You need deepfake awareness training for the legal industry.

The devastating potential of deepfakes to harm our society and the legal industry really hit me hard back in 2018 when I listened to a Radiolab podcast called “Breaking News.” (I love Radiolab!) This podcast introduced me to the incredibly sophisticated ways that cybercriminals were manipulating text, audio and video files to fake us out. And the podcast looked into the future, showing where the technology was going and how hard it was going to be to figure out what was real and what was fake. Ultimately, this breaks down our society’s trust, which is at the heart of pretty much any successful democracy, business venture, political decision, bingo game… you name it. Trust is critical to progress.

So, how do you protect your law firm or corporate legal department from the devastating effects of deepfakes? Educate your employees to recognize them and report them with deepfake awareness training for the legal industry.

Our partner, KnowBe4, has developed and launched a video starring “Christopher Walken” (but is it?) that discusses how criminals manipulate audio and video files, then distribute them to targets (your CFO?) to get them to take action, such as transferring large sums out of your firm’s bank account. (If people can’t figure out whether a video shows the real Tom Cruise – a very recognizable person – how will they know whether your CEO is actually your CEO asking them to transfer money?)

Here’s an excerpt from a recent Forbes article:

A Voice Deepfake Was Used to Scam a CEO Out of $243,000

It’s the first noted instance of an artificial intelligence-generated voice deepfake used in a scam.

Phone scams are nothing new, but the mark usually isn’t an accomplished CEO.

According to a new report in The Wall Street Journal, the CEO of an unnamed UK-based energy firm believed he was on the phone with his boss, the chief executive of the firm’s German parent company, when he followed the orders to immediately transfer €220,000 (approx. $243,000) to the bank account of a Hungarian supplier.

In fact, the voice belonged to a fraudster using AI voice technology to spoof the German chief executive.

Clearly, criminals are already having success with these tactics, and the technology is only improving. At the same time, we became more reliant on electronic communication during the pandemic, making us even more vulnerable.

You need to arm your employees and your firm with a strong sense of skepticism and proven ways to sniff out the fakes – with deepfake awareness training for law firms.

New Deepfakes Awareness Training from KnowBe4

This new Deepfakes video from KnowBe4 teaches your employees…

  • What deepfakes are
  • How deepfakes negatively impact trust in others and society in general
  • How to protect yourself and your organization from deepfakes

It also gives them short quizzes to confirm they have learned how to discern fact from fiction.

Would you like a free demo of this training video from KnowBe4? As a leading channel partner with KnowBe4, the world’s most popular security awareness training and simulated phishing platform, Savvy Training & Consulting can help you prevent security breaches and cyberattacks on your law firm or corporate legal department.

Book a 15-minute meeting with me on Calendly for a free demo!
