March 22

Would You Recognize Fraud That Uses Deepfake Technology?

Almost every week, I see a new article about the use of deepfake technology, and it’s becoming more refined by the day. Deepfake technology uses artificial intelligence to create or alter video and audio.

There are some interesting new uses for non-criminal purposes, such as during the recent political campaign in South Korea, where one of the candidates used an avatar to encourage people to vote (while also making disparaging remarks about his opponent). See an example: These Campaigns Hope ‘Deepfake’ Candidates Help Get Out the Vote.

Artificial intelligence can also generate high-quality video from text you supply. One company in this space, Synthesia, lets you create many types of video with a relatively inexpensive subscription. You can see how it works by generating a short demo video on the Synthesia website.

NVIDIA has announced its “Omniverse Avatar,” which will provide the capability to have a conversation with an artificial intelligence assistant. The CEO of NVIDIA, Jensen Huang, used the technology to create a virtual avatar of himself to conduct an “ask me anything” demo.

But there is also a growing trend of using deepfake technology to commit fraud.

Recently, the FBI put out a warning about techno-criminals who use video conferencing systems to impersonate executives in a new form of Business Email Compromise (BEC) attack. The “executive” (whose video is conveniently not working) tells an employee to send a wire transfer to an overseas account during a conference call. The FBI says that some cases involved the use of deepfake audio to impersonate the executive’s voice in order to make the conversation more convincing.

We’ve already seen a case where deepfake audio was used to instruct an employee to wire funds, resulting in a loss of over $243,000 (Unusual CEO Fraud via Deepfake Audio Steals US$243,000 From UK Company).

Research shows that it’s easier to fool most people with fake audio than with video. Several services can already use artificial intelligence to create an audio deepfake of a person’s voice from a relatively small audio sample. For more information, see the demos from services such as Descript.

I talk about deepfake technology and data poisoning in my book Techno-Crimes and the Evolution of Investigations. I provide even more examples of deepfake technology and its use on the bonus resources page that is available with your book purchase.

Deepfake and other technologies are growing exponentially and becoming more sophisticated. Soon you may not be able to tell the difference between what’s real…and what’s fake.

If you don’t have a system to validate instructions for funds transfers or to prevent this type of fraud, follow the FBI’s guidance and create one. If you see a video or hear audio that makes you question its veracity, then take more steps to validate its authenticity before taking any further action.

Learn more about how deepfake technology works, and consider letting your management and co-workers know how it can be used to commit fraud. Also, think about whether your clients might benefit from this information.

If you’re interested in learning more about how technology is being used to commit crimes, join the Techno-Crime Institute mailing list. When you click on this link, you’ll receive immediate access to our short mini-course, where I’ll explain why I believe technology will force investigations to evolve. You’ll also receive updates about new types of techno-crimes, information about security tools and techniques to protect your data, and ways to increase your privacy.

Deepfake technology is only one type of techno-crime, and you need to be aware of all the different ways technology can be used to commit crimes and hide evidence.

Then you’ll be in a better position to develop a strategy for these types of investigations and identify technical experts if you need additional help.

Don’t be left behind. Learn about techno-crimes.

For more information about techno-crimes, join the Techno-Crime Institute mailing list.

