GK8 by Galaxy

The Emerging Threat of Real-Time Deepfakes in Crypto Cybercrime


Introduction

“Believe your own eyes,” the old saying goes. But maybe not if you work in cryptocurrency.


Last year, U.S. citizens alone lost an estimated $9.3 billion to cryptocurrency-related scams and filed nearly 150,000 complaints, according to the FBI. A growing share of these schemes involves malicious use of artificial intelligence, especially deepfake technology.


Among AI-driven security threats, real-time deepfakes stand out for their sophistication and difficulty of detection. Attackers can now impersonate trusted individuals during live video calls, forging a convincing presence and exploiting human trust at its most vulnerable moment.


In recent months, GK8’s research team has observed a sharp increase in threat actors discussing, selling, and searching for tools or services related to real-time video and voice deepfakes. These services are circulating in darknet forums, Telegram channels, and private marketplaces, enabling less technically skilled actors to carry out sophisticated impersonation attacks.

Screenshots from this year show threat actors on a closed cybercrime forum discussing possible real-time deepfake scams in the crypto ecosystem.

High-Profile Real-Time Deepfake Call Scams and Impersonations

Several high-profile incidents illustrate how real-time deepfakes are being weaponized in the crypto ecosystem:

  • Multinational firm in Hong Kong (February 2024): An unidentified firm lost HK$200 million (US$25.6M) after scammers impersonated the company’s CFO and colleagues during a live video call. Except for the one bona fide worker who was duped into sending the money, all meeting participants were deepfaked replicas of real employees.
  • Polygon (May 2025): Attackers used deepfaked videos of co-founder Sandeep Nailwal and other team members on Zoom. Victims were tricked into installing a fake “voice SDK” that turned out to be malware, enabling asset theft.
  • Manta Network (April 2025): Co-founder Kenny Li encountered a deepfake impersonation during a Zoom call with known contacts. Attackers used a fake video and a malicious “Zoom update” prompt. Li quickly sensed something was wrong and cut the ruse short.
  • Job Interview Scams (2023–2025): North Korean actors and organized crime groups have utilized deepfakes to impersonate remote job applicants, successfully gaining access to internal systems across multiple sectors.

Changpeng Zhao (CZ), founder of Binance, the world’s largest crypto exchange, warned in June that real-time deepfakes were being used in attacks against crypto executives, exploiting trust and bypassing traditional video-based verification.

Deepfakes as a Tool of Fraud and Deception

Deepfake technology enables the realistic simulation of faces, voices, and behavior, allowing attackers to generate convincing synthetic identities. This capability is being actively exploited across the cybercrime ecosystem. Our investigations into threat actor forums and Telegram groups show that deepfakes are being used to:

  • Create promotional scam videos, such as fake celebrity-endorsed investment pitches
Russian threat actor promotes DFaaS (deepfakes-as-a-service) on a cybercrime forum, showcasing a fake Elon Musk image as an example of past work.
  • Create materials for catfishing (where stolen or fake sexual images are used to impersonate someone and trick victims into sending money or valuables), romance scams, sextortion, and other impersonation scams
  • Bypass platform onboarding and identity verification checks, such as KYC video submissions
  • Trick targets during live video calls, including job interviews and business meetings
  • Support other malicious applications observed across threat actor communities

Tried-and-true trust mechanisms, such as facial recognition, voice matching, or video confirmation, are now vulnerable. As AI continues to evolve, the line between real and synthetic is blurring, forcing organizations to reconsider how they authenticate users, verify partners, and detect fraud.

Deepfake tools are easily within reach of threat actors, whether purchased, rented, or obtained for free. Threat actors can also hire a professional deepfake-as-a-service (DFaaS) business that will provide them with a ready-to-go deepfake according to their requirements.

Headlines from forums where threat actors recommend or search for deepfake tools

Low Cost, High Impact: Deepfakes Within Reach

Reviewing the prices of these services, we found that they vary, but not dramatically. Pre-recorded creative deepfakes can start as low as $10–$15, with costs increasing depending on the complexity of the work and the length of the video. Real-time face-swapping deepfakes, on the other hand, are more expensive but still relatively affordable for motivated threat actors who know where to look. Prices generally start from a few hundred dollars and can reach into the low thousands. Vendors may sell software, subscription services, or one-time deepfake generation.

Examples collected from marketplaces and postings include:

  • $300/month for a single mobile account software subscription
  • $700 for a one-time software license
  • $500 for a one-time online deepfake verification
  • Real-time, live face replacement starting at $300

Payments are most commonly made in cryptocurrency through an escrow or guarantor service provided by the marketplace, reducing traceability and helping resolve disputes. The combination of low-to-moderate cost, crypto payments, and marketplace escrow lowers the barrier to entry and helps professionalize the underground deepfake economy.

The Rise of Real-Time Deepfakes and Interactive Scams in Crypto Cybercrime

In 2025, real-time deepfakes are being used more consistently as part of targeted social engineering attacks, particularly in the crypto ecosystem. Rather than aiming for perfect replication, deepfakes are increasingly used to lend credibility to live scams — in investment pitches, internal meetings, or identity checks — where timing and trust are critical. The strategy has shifted: attackers now prize timely, convincing deception over technical perfection.

Cybercrime marketplaces and forums are filled with postings that highlight the surge in demand:

  • This year, threat actors on known cybercriminal forums have been actively searching for high-quality software capable of replacing faces during live video calls, a functionality likely aimed at impersonating individuals during real-time calls.
  • Other actors actively sought deepfake tools or services to enhance their live interactions with victims, specifically for attacks targeting crypto investors. These use cases suggest a shift from pre-recorded content (e.g., deepfaked YouTube videos) to interactive scams designed to exploit trust in real-time communication.
Job posting by a threat actor seeking scammers with specific skills and crypto industry knowledge
  • One service provider openly advertised “deepfake-as-a-service,” claiming the ability to mimic not only civilians or executives, but even law enforcement figures such as the FBI. 
  • Threat actors operating on a Telegram-based marketplace in Southeast Asia, a notorious hotbed of “pig butchering” scams, are advertising real-time face- and voice-alteration software for calls. Following Telegram’s recent blocking of the related Huione Group marketplace, many of its users migrated to a new Telegram marketplace. The U.S. Department of the Treasury’s Financial Crimes Enforcement Network (FinCEN) has linked Huione Group to North Korean hacker collectives involved in numerous real-time deepfake scams, including cases reported this year and documented in this report. This new platform also provides other services to fraudsters, including technology, data, and money-laundering support.

These and similar posts reflect more than just a technological trend: they reveal the rise of a professionalized business ecosystem built around real-time deepfakes. Threat actors are not only buying and selling tools – they are recruiting talent with the same expectations as legitimate employers, such as conscientiousness, industry knowledge, and punctuality. They advertise their services with claims of quality and innovation, compete on pricing, and cultivate reputations. In many ways, this underground economy mirrors the structure and language of the legitimate tech sector – a parallel world where deception is the product, and professionalism is part of the pitch. Far from a passing trend, this market is maturing, expanding globally, and unlikely to disappear.

Lessons and Action Items

The rise of real-time deepfake scams marks a critical inflection point in cyber-enabled fraud, one where AI-powered deception can now occur in live interactions, impersonating trusted voices and faces with alarming precision. These are not simple phishing attacks – they are coordinated, tailored, and convincing enough to bypass trained professionals.

Organizations, especially in the crypto and financial sectors, must rapidly adapt to this new threat model. It is no longer enough to defend against spoofed emails or fake websites. We now face adversaries with the tools to enter your video calls, impersonate your leadership, and deliver malware through human conversation.

At GK8, we believe addressing this threat requires an evolved defense posture grounded in technical and behavioral resilience. Companies must:

  • Train employees to assume that video and voice identities may be compromised, because faces and voices on screen can be synthetically generated. Before taking any sensitive action, workers should use out-of-band verification methods such as separate channels, known safe contacts, or cryptographic authentication.
  • Educate executives and high-risk staff. Founders, C-level leaders, investor-facing roles, and key developers are top targets. They must be continuously trained to recognize modern deepfake tactics, such as muted video calls, fake update prompts, or calls with “technical issues.”
  • Strengthen internal approval protocols. Mandatory multi-party approval processes should be implemented for all major transactions and access requests, including asset movements, software changes, or privileged access. This means no critical action should rely solely on a single individual’s confirmation.
  • Establish a company policy requiring multi-person review for significant digital asset transactions. Any transaction requests initiated solely via video or audio, which are susceptible to deepfake manipulation, must be re-verified through additional actions.
  • Design custody systems and infrastructure for resilience. Assume that any individual can be socially engineered. Use GK8’s Unlimited multi-party computation (uMPC) key management technology with role separation and quorum approvals, ensuring no single employee can compromise critical systems or assets.
  • Monitor the threat landscape. Track how cybercriminals are adopting new tools and services, such as deepfake-as-a-service platforms or impersonation marketplaces, to refine your detection and response capabilities.
  • Consider a healthy combination of hot, cold, and theft-proof custody to store your funds. Remember, even the most advanced defenses can be bypassed, which is why we recommend keeping only a small amount of funds in a hot wallet, with the rest secured completely offline in an Impenetrable Vault from GK8.
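To make the multi-party approval recommendation concrete, here is a minimal sketch of M-of-N quorum logic for releasing a sensitive transaction. All names and channel labels are illustrative assumptions, not part of any GK8 product or API; a real policy engine would add authentication, persistence, and audit logging.

```python
# Minimal sketch of M-of-N quorum approval for a sensitive transaction.
# Names and channel labels are hypothetical, for illustration only.
from dataclasses import dataclass, field

# Channels that a real-time deepfake could plausibly spoof.
DEEPFAKE_PRONE_CHANNELS = {"video", "voice"}

@dataclass
class PendingTransaction:
    tx_id: str
    amount: float
    required_approvals: int = 3              # quorum size (M)
    approvals: set = field(default_factory=set)

    def approve(self, approver_id: str, channel: str) -> None:
        # Reject approvals arriving over channels a deepfake can forge;
        # require out-of-band confirmation (hardware token, in person, etc.).
        if channel in DEEPFAKE_PRONE_CHANNELS:
            raise ValueError(f"channel {channel!r} is deepfake-prone; rejected")
        # A set ensures one vote per distinct person, so repeated approvals
        # from the same (possibly impersonated) individual do not count twice.
        self.approvals.add(approver_id)

    @property
    def released(self) -> bool:
        # No single individual's confirmation is ever sufficient.
        return len(self.approvals) >= self.required_approvals
```

In this sketch, an attacker who deepfakes one executive on a video call gains nothing: that channel is rejected outright, and even a compromised out-of-band approval is only one of the M distinct votes required.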

The real-time deepfakes trend forces a rethinking of digital verification strategies. Traditional security models that rely on visual or voice confirmation during Zoom calls, support chats, or onboarding interviews are rapidly becoming obsolete.

Real-time deepfakes reveal vulnerabilities in traditional identity verification methods and call for new approaches to protecting digital assets. For the crypto industry, using secure and impenetrable custody, implementing multi-step approval processes, and enforcing role separation are essential to reducing risk. Security strategies must be comprehensive, addressing technical and organizational controls to withstand even the most sophisticated social engineering attacks.
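One replacement for visual confirmation mentioned above is cryptographic authentication. As a hedged illustration (not a GK8 mechanism), a pre-shared secret exchanged in person lets a colleague prove their identity with an HMAC challenge-response over a separate channel, independent of anything shown on a video call:

```python
# Sketch of out-of-band challenge-response verification using HMAC.
# Assumes a secret was pre-shared in person; names are illustrative.
import hashlib
import hmac
import secrets

def make_challenge() -> str:
    # Fresh random challenge, so old responses cannot be replayed.
    return secrets.token_hex(16)

def respond(shared_secret: bytes, challenge: str) -> str:
    # The remote party computes this over a separate, trusted channel.
    return hmac.new(shared_secret, challenge.encode(), hashlib.sha256).hexdigest()

def verify(shared_secret: bytes, challenge: str, response: str) -> bool:
    expected = respond(shared_secret, challenge)
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(expected, response)
```

A deepfaked face or voice cannot produce a valid response, because the proof depends on possession of the secret rather than on appearance or sound.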

Disclosures:

This document has been prepared by GK8, a Galaxy company, solely for informational purposes. It does not constitute an offer to buy or sell, or a solicitation of an offer to buy or sell, any advisory services, securities, futures, options, digital assets, or other financial instruments, nor does it constitute investment, legal, or tax advice.
Any statements or views expressed herein reflect current observations regarding cybersecurity trends and custody architecture and do not guarantee protection against unauthorized access, fraud, or asset loss. References to specific custody models (including MPC and Vault architecture) are illustrative and should not be interpreted as guarantees of performance or security.
Certain information contained in this report, including observations on threat actor tactics and forum activity, has been derived from third-party sources. GK8 and Galaxy Digital Holdings LP (“Galaxy Digital”) do not independently verify such data and make no representations as to its accuracy or completeness.
Galaxy Digital and its affiliates may have financial interests in, or provide services to, entities and protocols discussed in this report. If the value of such assets increases, those entities and/or protocols may benefit, and Galaxy Digital’s service fees may increase accordingly. The views expressed are those of the authors and do not necessarily reflect those of Galaxy Digital, GK8, or their affiliates.
© Copyright Galaxy Digital Inc. 2025. All rights reserved.
