The Financial Conduct Authority (FCA) regulates financial services firms and financial markets in the UK. Following the UK Government’s publication of its pro-innovation AI strategy in February 2024, the FCA issued an update on its approach to AI. In the AI Update, the regulator makes clear that it is focused on how firms can safely and responsibly adopt AI, as well as on understanding the impact AI innovations are having on consumers and markets. “This includes close scrutiny of the systems and processes firms have in place to ensure our regulatory expectations are met,” the update states.
The FCA as a user of AI
To ensure it has the proper grounding in AI needed to evaluate financial firms’ use of the technology, the FCA has made itself a significant user of AI and contributor to AI advancement in the financial sector. “Continuing to improve how we use data and technology is helping us become a more innovative, assertive and adaptive regulator and will allow us to achieve the strategy laid out in our Data Strategy (2022) towards becoming a digital and data led regulator,” says the AI Update report.
What is most notable about the FCA’s use of AI is the advanced AI infrastructure the regulator has itself built and now uses, including the following:
The Advanced Analytics Unit. The FCA created its own Advanced Analytics Unit, which is using AI to develop tools that monitor scam websites. The unit also uses an in-house synthetic data tool for Sanctions Screening Testing which, the FCA says, has transformed its ability to assess firms’ sanctions name screening systems (a hedged sketch of what such a test might look like appears after this list). Regarding sanctions screening specifically, the FCA states in the AI Update, “We have and will continue to work closely with OFSI (The Office of Financial Sanctions Implementation) and other key stakeholders on this key area.”
Digital Sandbox. The FCA’s Digital Sandbox provides GDPR-compliant datasets in a secure environment: 300+ synthetic, public, anonymized and pseudonymized data sets, exposed through over 1,000 API endpoints, with a scope that covers entities, corporate individuals, consumers, transactions, financial statements, loans, credits and investments. A built-in development environment allows scalable experimentation while safeguarding data assets on the platform.
The Synthetic Data Expert Group. In the report, the FCA recognizes that synthetic data has the potential to contribute to beneficial and responsible innovation in financial services and to help address financial crime and fraud. The FCA set up the Synthetic Data Expert Group, which recently published a report on synthetic data applications in financial services, offering unique insights into the use cases, opportunities and challenges posed by this AI-driven technology.
AI and Digital Hub. The Competition and Markets Authority (CMA), Ofcom, the Information Commissioner’s Office (ICO) and the FCA together make up the Digital Regulation Cooperation Forum (DRCF), formed in 2020 to coordinate the regulation of online services. The DRCF established the AI and Digital Hub to provide informal advice on how regulation applies to innovation. According to the DRCF, innovators can “ask a specific query which spans the regulatory remits of DRCF member regulators…The aim of the Hub is to increase innovators’ confidence in bringing new products, services and business models safely to market, by helping them understand and navigate regulatory requirements. This is a valuable opportunity for innovators to receive free and informal advice from four regulators all in one place.”
Team of data scientists. As of April 2024, the FCA was recruiting more than 75 data scientists to explore how it can use AI to pursue its objectives.
Machine Learning Surveys. The FCA will soon run a third edition of its machine learning (ML) survey; the second edition is publicly available. The FCA runs the surveys jointly with the Bank of England and collaborates with the UK’s Payment Systems Regulator (PSR) to consider AI across payment systems. “Being proactive in gaining insights and intelligence on the impact AI is having on UK financial markets allows us to respond to developments with speed and agility,” states the FCA AI Update report.
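To make the sanctions screening idea above concrete, here is a minimal, hypothetical sketch of how a synthetic-data screening test might work: generate fictitious customer names, perturb a few so they resemble watchlist entries, and check whether a simple fuzzy-matching screener still flags them. The watchlist, names, matching threshold and use of Python’s difflib are all illustrative assumptions, not a description of the FCA’s actual tool.

```python
# Hypothetical sketch: testing a sanctions name-screening function with synthetic data.
# The watchlist, customer names and the 0.85 threshold are invented for illustration only.
from difflib import SequenceMatcher

WATCHLIST = ["Ivan Petrov", "Acme Trading LLC", "Maria Gonzalez"]  # toy sanctions list

def screen_name(name: str, threshold: float = 0.85) -> bool:
    """Return True if `name` fuzzily matches any watchlist entry."""
    return any(
        SequenceMatcher(None, name.lower(), entry.lower()).ratio() >= threshold
        for entry in WATCHLIST
    )

def synthetic_customers(n: int = 100) -> list[tuple[str, bool]]:
    """Generate synthetic customer names; a few are near-misses of watchlist entries."""
    clean = [(f"Customer {i}", False) for i in range(n - 3)]
    # Slightly perturbed watchlist names that a sound screener should still catch.
    evaders = [("Ivan Petrof", True), ("Acme Traiding LLC", True), ("Mariya Gonzalez", True)]
    return clean + evaders

if __name__ == "__main__":
    data = synthetic_customers()
    caught = sum(1 for name, is_bad in data if is_bad and screen_name(name))
    false_hits = sum(1 for name, is_bad in data if not is_bad and screen_name(name))
    print(f"Detected {caught}/3 synthetic evaders, {false_hits} false positives")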
FCA planned actions around AI for the next 12 months
The FCA states in section 3.5 of the AI Update report that the agency is particularly interested in how AI can help identify more complex types of market abuse that are currently difficult to detect, such as cross-market manipulation. The FCA seeks to improve the accuracy of market abuse detection more generally and ultimately transform market abuse surveillance by incorporating anomaly detection. It goes on to say this in section 4.2: “The FCA is currently involved in diagnostic work on the deployment of AI across UK financial markets.”
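The AI Update does not describe a specific model, but the anomaly-detection idea in section 3.5 can be illustrated with a generic sketch: fit an unsupervised detector, such as scikit-learn’s IsolationForest, on simple per-account trading features and surface the most unusual accounts for analyst review. The features and data below are synthetic and purely illustrative; this is one common approach, not the FCA’s system.

```python
# Illustrative only: a generic anomaly-detection pass over synthetic trading features,
# in the spirit of the surveillance use case described in the AI Update.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Synthetic per-account features: [orders per day, cancel ratio, cross-venue correlation]
normal = rng.normal(loc=[200, 0.30, 0.10], scale=[50, 0.05, 0.05], size=(500, 3))
# A handful of accounts with patterns that might warrant review
suspect = rng.normal(loc=[900, 0.85, 0.70], scale=[50, 0.05, 0.05], size=(5, 3))
features = np.vstack([normal, suspect])

detector = IsolationForest(contamination=0.01, random_state=0).fit(features)
scores = detector.decision_function(features)  # lower score = more anomalous
flagged = np.argsort(scores)[:10]              # ten most anomalous accounts

print("Accounts flagged for analyst review:", flagged)
```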
Why is the FCA doing all this work around AI?
Many businesses and organizations that fall under the FCA’s regulatory umbrella view the FCA purely as a regulator laser-focused on ensuring the integrity of the UK financial system and the proper functioning of relevant markets. And it is true that the FCA’s three core operational objectives underscore that view. The FCA says as much in the AI Update:
“Our operational objectives are to:
- secure an appropriate degree of protection for consumers
- protect and enhance the integrity of the UK financial system, and
- promote effective competition in the interests of consumers.”
But what many people do not realize is that the FCA also focuses heavily on making the UK economy more internationally competitive. In the AI Update, the FCA states, “We also have a secondary objective to facilitate the international competitiveness of the UK economy and its growth in the medium to long term…we have an important role in the continued success and competitiveness of the UK financial services markets and their contribution to the UK economy. This extends to the role of technology, including AI, in UK financial markets.”
What the AI-supportive FCA means for financial services firms
The FCA characterized its approach to AI in this way: “We support the Government’s pro-innovation approach to AI, including its commitment to fund a pilot of AI & Digital Hub delivered by DRCF member regulators. Our work on innovation and our secondary competitiveness and growth objective provides a clear path for us to foster technological exploration.”
Notice that the FCA bolded the terms “pro-innovation” and “secondary competitiveness and growth objective.” This signals to banks and other financial services firms that they do not need to convince the FCA that using AI is necessary, only that they are deploying it in ways that improve the efficacy of AML, sanctions and other anti-FinCrime operations while maintaining a safe environment for consumers.
AI is not new in banking. Compliance programs have been leveraging AI, especially machine learning (ML) and natural language processing (NLP) technologies, with great success since 2016. This is why regulators like the FCA, as well as compliance leaders at financial institutions, are both promoting and planning for further widespread adoption of AI-based technologies in 2024 and beyond.
To learn how your organization can benefit from pre-built AI agents that tirelessly battle FinCrime and align with the requirements of the FCA and other regulators around the world, schedule a demo today.