By Evan Spicer
Director of Cryptocurrency Investigations
Audio and video may seem like a surefire way to confirm that you are talking to an actual person. However, cybercriminals have shown that they can trick their victims by manipulating images and recordings of trusted people.
Deepfakes Attack Binance Executives
One shocking example occurred when Binance Chief Communications Officer Patrick Hillmann received messages from colleagues about a meeting he supposedly led but which he never actually attended.
It was discovered that cybercriminals had used a hologram of Hillmann to call a meeting of executives without his knowledge. The hackers edited old interviews featuring him to make it appear as if he were holding the meeting.
In the wake of this disturbing news, Hillmann nevertheless maintained his sense of humor and quipped that the executives should have known it was a deepfake because while he had gained 15 pounds due to COVID lockdowns, the videos showed a thinner version of Hillmann from years earlier.
Deepfakes – The Dangerous Identity Crisis
Deepfakes, however, are no laughing matter and are becoming yet another weapon in the arsenal of desperate cybercriminals. Incidents like the deepfake of Binance executives occur daily on many social media platforms, such as Instagram, Facebook, Twitter, Telegram, and especially LinkedIn.
Once these cybercriminals convince victims they are who they pretend to be, they offer various fake opportunities. They may tell people they will feature their projects on Binance or other platforms, or may claim that their targets can take advantage of deals with huge returns.
Of course, these opportunities and deals are fake, but often victims don’t find out until their funds or data have already been compromised.
How to Stay Safe from Deepfake Frauds
Not surprisingly, given that holograms of its executives have been misused, Binance has gotten the memo about the prevalence of deepfake fraud. In response, Binance has implemented an advanced tool to verify whether images and identities are real.
Some governments have taken action against the problem of deepfakes by outlawing them, even when they aren’t used for fraudulent purposes. Detection technology, like that used by Binance, is also in development and is becoming more accurate at identifying doctored video and audio.
However, you can often, though not always, tell if a video is fake. If you notice uneven skin tone or quality, jerky movement, unnatural blinking, or lips that aren’t well-synced with speech, you could be dealing with a deepfake.
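To illustrate, one of these tells, unnatural blinking, can even be checked programmatically. The sketch below is a simplified, hypothetical example (not Binance’s actual tool): it computes the eye aspect ratio (EAR), a standard measure from facial-landmark research, from six eye landmark points per frame and counts blinks. A real pipeline would first extract those landmarks from video with a library such as MediaPipe or dlib, which is omitted here.

```python
import math


def eye_aspect_ratio(eye):
    """Compute the eye aspect ratio (EAR) from six (x, y) landmarks
    ordered p1..p6: p1/p4 are the horizontal eye corners, p2/p3 the
    upper lid, p5/p6 the lower lid. EAR drops toward 0 when the eye
    closes and sits around 0.2-0.4 when it is open."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    # EAR = (|p2-p6| + |p3-p5|) / (2 * |p1-p4|)
    return (dist(eye[1], eye[5]) + dist(eye[2], eye[4])) / (2.0 * dist(eye[0], eye[3]))


def count_blinks(ear_series, closed_thresh=0.2):
    """Count blinks in a per-frame EAR sequence: each dip below the
    threshold followed by a recovery above it counts as one blink."""
    blinks, closed = 0, False
    for ear in ear_series:
        if ear < closed_thresh and not closed:
            closed = True
        elif ear >= closed_thresh and closed:
            blinks += 1
            closed = False
    return blinks


# Synthetic demo: an open eye, then a sequence containing one blink.
open_eye = [(0, 0), (1, 1), (3, 1), (4, 0), (3, -1), (1, -1)]
print(round(eye_aspect_ratio(open_eye), 2))          # high EAR: eye open
print(count_blinks([0.5, 0.5, 0.05, 0.05, 0.5]))      # one dip-and-recover
```

An unusually low blink count over a long clip (people normally blink roughly 15–20 times per minute) would be one signal, among many, that footage has been synthesized; the threshold and landmark ordering here are illustrative assumptions.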
One of the simplest ways to determine whether a video is real or fake is to verify all information by contacting the person directly through a channel you trust. It may take some time to get a response, but that wait can save you a significant amount of money. You could also ask the social media platform to review the video.
If You’ve Lost Money or Data to a Deepfake Fraud, Contact MyChargeBack
It’s important not to give up if you are a victim of online fraud, but to seek recourse immediately. MyChargeBack has developed working relationships with law enforcement agencies worldwide, has extensive knowledge and experience with crypto tracking, and can improve your prospects of getting your funds back.