
ZISHI’s lead on Emerging AI Risks maps current thinking onto the FCA’s four recommended focus areas for generative AI tools (‘GenAI’)

Following today’s publication by the Financial Conduct Authority of its update, ‘Reducing and preventing financial crime’ (https://www.fca.org.uk/publications/corporate-documents/reducing-and-preventing-financial-crime), ZISHI’s lead on Emerging AI Risks has mapped our current thinking onto the FCA’s four recommended areas of focus in relation to generative AI tools (‘GenAI’).


Data and Technology: “we encourage firms and partners to be bolder and more collaborative in how they engage with new technologies and approaches to keep up with emerging risks.” FCA

The generative AI challenge is multi-dimensional. On the one hand, weak guardrails have allowed criminals to use AI-generated images, video, audio, and text to trick victims into transferring funds, downloading malware, or surrendering personal and financial data. Last weekend’s report of a $25 million AI-enabled fraud in Hong Kong underscores this point.


On the other hand, GenAI, when used ethically, has been shown to extend staff capabilities and accelerate content creation, although human users’ knowledge of the subject and their ability to check content for accuracy remain essential. Nevertheless, even this aspect of GenAI has significant downsides, including the risk that AI systems might breach the intellectual property rights of human content creators, or that staff using GenAI might upload proprietary or confidential information to the AI platform. Inaccuracies and hallucinations in GenAI products are an additional concern, so the adoption of this new technology needs to be conducted in a carefully controlled fashion.


Collaboration: “criminals will aim to identify and exploit the weakest firms and sectors, so sharing data and intelligence is a vital tool in staying one step ahead…” FCA

As we’ve seen, criminals are using GenAI to target vulnerable firms and individuals. Fake online personas spreading disinformation to send signals to a target market, social engineering scams, and the spread of ransomware are all being facilitated by these new technological offerings. Data and intelligence sharing initiatives must identify this as a hot topic and include statistics and case studies on the criminal use of GenAI as discrete subject areas in briefings and alerts.


Communication: “raising consumer awareness is essential to combatting financial crime…” FCA

Few consumers are sufficiently aware of the power of GenAI to convincingly fabricate or simulate the faces, voices, or writing styles of colleagues, customers, friends, and family. Raising consumers’ awareness of these risks should be a top priority.

Measuring effectiveness: “we need to know that what we are doing is working…” FCA

Phishing and social engineering test campaigns have long formed part of most corporate security programmes. These tests now need to incorporate GenAI attacks at all levels: email and social media messaging and profiles, video and voice calls, disinformation tests (to assess gullibility and awareness), and hybrid GenAI attacks that combine two or more of these methods.

GenAI is evolving extremely rapidly. What may today feel like science fiction to most readers is already, in fact, a reality. By tomorrow it will all be old hat and a new range of threats will have emerged.

ZISHI | Resilience Through Knowledge info@thezishi.com
