FRC guidance on use of AI in audit marks “landmark” publication
The publication of the Financial Reporting Council’s first-ever guidance on the use of AI in audit reflects the rapid evolution of automated tools in the profession

The Financial Reporting Council (FRC) has described its new guidance on the use of artificial intelligence (AI) in audit as a “landmark” publication.
Issued in late June in response to the recent rise in the use of AI among accounting and advisory firms, the guidance reflects the council’s support for innovation, says Mark Babington, FRC Executive Director of Regulatory Standards.
“AI tools are moving beyond experimentation to becoming a reality in certain audit scenarios,” Babington explains.
“We are not against new tools or new ways of working. When deployed responsibly and in an appropriately targeted way, they have potential to enhance audit quality, support market confidence and drive innovation.
“The purpose of this guidance is to help auditors articulate their business case for the use of AI in audit, and link this to the requirement for documentation detailing how these AI tools have been evaluated and deployed in the audit process.”
The guidance outlines a coherent approach to implementing a hypothetical AI-enabled tool, Babington explains, alongside insights into FRC documentation requirements.
“We recognise that this field is moving quickly—we are already seeing the use of some AI tools in the audits we inspect—so it is a good time for us to engage,” says Babington.
“We know that firms using our standards and material want to be able to use AI to query their audit manuals and access technical information and, beyond this, that firms are also using machine learning in core audit activities.
“We are also seeing AI being used for things like board minutes summaries and walk-through testing write-ups. Some firms are using third-party AI tools like DataSnipper, which uses optical character recognition to help find and triage information.
“Others are using more established technology providers like MindBridge, an AI-powered financial risk intelligence platform that can help to identify anomalies in the data they work with.”
The FRC guidance illustrates how AI can enhance audit work and clarifies its expectations regarding the relevant documentation.
It is split into two parts, featuring both an illustrative example of the use of an AI-enabled tool to test journals, and principles intended to support proportionate and robust documentation of tools using AI.
“The guidance should support auditors and central teams at audit firms as they develop and use AI tools in their work, while also providing third-party technology providers with the regulatory expectations for their customer base,” Babington says.
“It is comprehensive in scope, but it is not prescriptive and does not introduce new regulatory requirements, instead focusing on supporting innovation while maintaining appropriate standards.”

Thematic review: observations
The FRC guidance uses a broad and forward-looking definition of AI, encompassing both traditional machine learning and deep learning models, including generative AI.
An accompanying thematic review summarises insights into the processes and controls used by the six largest firms in the UK to certify automated tools and techniques for use in audits.
“The thematic review includes examples of good practice in these processes, which are fundamental to the delivery of audit quality,” Babington explains.
The review notes that audit firms are increasingly making use of Automated Tools and Techniques (ATTs) to perform risk assessment procedures and obtain audit evidence.
“We are now seeing increasing use of ATTs in more audit areas,” it states, “with some beginning to incorporate emerging technologies, such as AI.

“The use of ATTs has significant potential to improve audit quality, though this is dependent on the ATTs producing consistently reliable outputs and being used routinely in the intended manner.”
In this context, the FRC notes the importance of an effective certification process to verify the reliability of any ATTs used by firms, and their suitability for use in audits.
The council’s definition of certification broadly aligns to the key stages of a system development lifecycle, capturing:
– Initial planning and needs analysis;
– Design and development;
– Certifying the ATT for implementation; and
– Subsequent maintenance and monitoring.
“Overall, we found that most of the firms we reviewed had well-established processes in place to certify ATTs prior to deployment for use in audits. However, in some cases, these processes were less mature and not supported by documented policies,” Babington says.
AI: good use case
“There is a lot of momentum in the AI market currently, but using it just for the sake of it will only result in a poor outcome. You can’t simply ‘retrofit’ a new AI tool to your existing audit process without running into trouble,” warns Babington.
“Good use of AI involves identifying a specific problem within your audit process and then thoroughly assessing the relevant, available AI tools to determine if they could offer an appropriate solution.
“You must carefully consider all of the potential risks when using any tool like this. Data security is a big one; another is the potential for automation bias, particularly when the machine learning is ongoing.
“Auditors need to fully grasp how any AI tools they use actually work—if a tool doesn’t work the way you think it does, then your risk is magnified. Your certification process has to be robust enough to identify all potential risks and ensure the responsible and appropriate use of AI at all times.”
“Some examples of good practice we observed across the certification process included the use of innovative ways to identify opportunities for using ATTs in audits, guiding audit teams through the ATTs available to them depending on their requirements, and targeting required training to relevant users.
“Some firms are also proactively reviewing their ATTs over time to confirm they remain appropriate for use in audits.”
The thematic review informed the development of the FRC’s documentation guidance as a practical support to firms in the sensible and considered use of AI in audit.
FRC: future AI guidance
The FRC plans to publish further guidance for auditors and for advisory and accounting firms using AI.
“This guidance is the first we have published on the use of AI in audit, but we are already giving further thought to other areas requiring AI guidance in the future,” says Babington.
“We recognise that this is a fast-moving area and that there will be demand for further guidance in the future. We will continue to engage with our stakeholders and working groups to ensure we can continue to support the AI transition.”