UKGC Warns of Emerging AI Risks and Crash Games

23RD APRIL 2025

The UK Gambling Commission has identified a new wave of money laundering and terrorist financing threats facing operators this month, including AI-generated deepfakes and high-risk crypto-based crash games.

Regulatory Update: The UKGC’s Warning

The UK Gambling Commission (UKGC) recently identified two major risk areas in its updated anti-money laundering (AML) guidance.

First, the UKGC has flagged emerging money laundering threats associated with AI. In a striking addition to the guidance, the Commission cited the use of AI-generated deepfakes in identity fraud as a rising risk.

The UKGC stresses that operators need to be aware of how AI and deepfakes can be used to bypass Know Your Customer (KYC) and AML checks. This includes altered identity documents, synthetic videos for identity verification, and manipulated communications that mask a user's true identity, intent or location.

This is particularly concerning as platforms increasingly rely on automated verification tools and services to facilitate remote onboarding. Compliance teams must now contend with the possibility that even high-definition video or biometric verification may no longer be sufficient on their own.

Second, the Commission raised concerns about crash games - fast-paced, AI-powered products that blend gaming mechanics with gambling - and their potential appeal to vulnerable groups, particularly younger demographics.

The updated guidance states: “Typically the mechanics of the games mean that, once the initial bet is made, the round begins with a starting multiplier, which grows as the game progresses. Customers have the option to cash out at any point, but if the game crashes before a customer has cashed out, they will lose the money from the multiplier as well as their stake.”
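To make those mechanics concrete, here is a minimal sketch of a single crash round in Python. The house-edge constant and the crash-point distribution are illustrative assumptions, not taken from the guidance or from any operator's implementation:

```python
import random

HOUSE_EDGE = 0.01  # illustrative 1% edge; real values vary by operator


def draw_crash_point() -> float:
    """Draw the multiplier at which this round crashes.

    Uses a heavy-tailed inverse-CDF formula often cited for crash
    games; the exact distribution here is an assumption.
    """
    r = random.random()
    return max(1.0, (1.0 - HOUSE_EDGE) / (1.0 - r))


def play_round(stake: float, cash_out_at: float) -> float:
    """Return the player's net result for one round.

    The multiplier grows from 1.0; if the round crashes before the
    player's chosen cash-out point, the stake is lost.
    """
    crash_point = draw_crash_point()
    if cash_out_at < crash_point:
        return stake * cash_out_at - stake  # cashed out in time
    return -stake  # crashed first: stake and accrued winnings lost


print(play_round(stake=10.0, cash_out_at=2.0))
```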

While these games have become popular for their social and interactive appeal, they also present unique challenges in terms of player protection and transparency.

Operators must ensure that game mechanics are clearly understood by players, that outcomes are provably fair, and that AI is not used to manipulate in-game experiences or predict player behaviour in ways that might lead to exploitative practices.
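On the provably-fair point, a pattern many crypto crash games use is commit-reveal: the operator publishes a hash of a secret server seed before the round, derives the crash point from that seed combined with a client-supplied seed, and reveals the seed afterwards so anyone can verify the outcome was not manipulated. A minimal sketch, assuming an HMAC-SHA256 derivation (the formula is illustrative):

```python
import hashlib
import hmac


def commit(server_seed: bytes) -> str:
    """Publish this digest before the round, binding the operator to the seed."""
    return hashlib.sha256(server_seed).hexdigest()


def crash_point(server_seed: bytes, client_seed: bytes) -> float:
    """Derive the crash multiplier from both seeds (illustrative formula)."""
    digest = hmac.new(server_seed, client_seed, hashlib.sha256).digest()
    n = int.from_bytes(digest[:6], "big")  # integer in [0, 2**48)
    r = n / 2**48
    return max(1.0, round(0.99 / (1.0 - r), 2))


def verify(commitment: str, server_seed: bytes,
           client_seed: bytes, claimed: float) -> bool:
    """After the reveal, check the seed matches the commitment and the
    published crash point follows deterministically from the seeds."""
    return (hashlib.sha256(server_seed).hexdigest() == commitment
            and crash_point(server_seed, client_seed) == claimed)


server, client = b"operator-secret-seed", b"player-chosen-seed"
c = commit(server)
point = crash_point(server, client)
assert verify(c, server, client, point)
```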

What Gambling Operators and Software Providers Should Do

Here are some things to consider:

  1. Reassess AI Use Cases - Review every current AI deployment from a compliance standpoint. Are your systems built to withstand manipulation by advanced AI threats? AML/CFT Technology Risk Assessments should consider both the advantages and the risks associated with the use of AI.
  2. Enhance Due Diligence Processes - Strengthen KYC and AML procedures to account for AI-enabled fraud. This may involve layering traditional checks with new methods like device fingerprinting and behavioural analytics (see the sketch after this list).
  3. Audit Game Mechanics - Particularly in high-risk game formats like crash games, ensure transparency, fairness, and responsible design.
  4. Train Your Teams - Your compliance, tech, and fraud teams need regular training on the risks and signals of AI-enabled threats.
  5. Engage With Regulators - Maintain open lines of communication with the UKGC and other regulators to stay informed about expectations and guidance.
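As a sketch of the layering idea in point 2, the function below blends a traditional document check with hypothetical device-fingerprinting and behavioural signals into a single risk score. The signal names, weights, and threshold are all illustrative, not a prescribed model:

```python
from dataclasses import dataclass


@dataclass
class VerificationSignals:
    document_check_passed: bool    # traditional KYC document result
    device_seen_before: bool       # hypothetical device-fingerprinting signal
    liveness_score: float          # 0.0-1.0 from a liveness/anti-deepfake check
    typing_cadence_anomaly: float  # 0.0-1.0 behavioural-analytics anomaly score


def risk_score(s: VerificationSignals) -> float:
    """Blend the signals into a 0-1 risk score (weights are illustrative)."""
    score = 0.0
    if not s.document_check_passed:
        score += 0.4
    if not s.device_seen_before:
        score += 0.1
    score += 0.3 * (1.0 - s.liveness_score)
    score += 0.2 * s.typing_cadence_anomaly
    return min(score, 1.0)


def requires_manual_review(s: VerificationSignals, threshold: float = 0.5) -> bool:
    """Escalate to a human reviewer rather than trusting any single check."""
    return risk_score(s) >= threshold


print(requires_manual_review(VerificationSignals(True, False, 0.9, 0.2)))
```

The point of the design is that no single check, however sophisticated, is treated as conclusive on its own.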

AI is not going away - it will only become more integral to how our industry operates - but its unchecked use can introduce significant compliance risks.

Learn More

At Amber Gaming, we understand what it’s like to carry the weight of operating in such a well-regulated, fast-moving industry. If you would like to receive further guidance, support or training on how to manage AI in compliance, feel free to get in touch with our expert team via our Contact Us page.