AI — Making Supercharged Superheroes Out of Auditors

By this point, we all know that Artificial Intelligence is set to change the face of business across the globe. In fact, the change is already happening in no small way. But what does this mean for auditors, both internal and external? Two things: first, AI will make superheroes out of auditors; and second, it will supercharge their powers.

Auditors as superheroes

Some organisations have kept AI at arm’s length, in part due to an acute lack of trust within their walls and among their customers, a situation not helped by the current absence of any AI-specific legal framework (which we’ll get to in a second). But as it becomes increasingly difficult to keep AI-powered technologies at a distance and remain competitive, auditors will need to swoop in and save the day. And these caped crusaders will do so by providing businesses with the assurances they need to step into the future free from worry over (and I say this tongue in cheek) “villainous AI”.

Their role in promoting organisational and public trust in AI will only become more crucial when the EU’s AI Regulation enters into force, which will likely spur many firms into finally hopping aboard the AI bullet train. The European Commission unveiled its hotly anticipated draft of the rules in April 2021, and they are expected to come fully into force sometime between 2024 and 2026.

For more detail, check out this blog I wrote last year, but all you need to know here is that the AI Regulation is simply the first step in what will hopefully become a massive global effort to mitigate the risks of AI. As organisations start to introduce these technologies, or as they begin to adapt to the new rules, auditors will help leaders develop an in-depth and far-reaching AI risk-management approach.

Auditors will also be an integral part of any firm’s AI governance framework, alongside the internal individuals or bodies tasked with overseeing the creation and application of responsible AI systems. Their independent audits will complement internal audits to ensure compliance. Some organisations have already started to prepare by developing the necessary capabilities, so auditors could get a head start here too.

Businesses will also likely have to lean on the services of auditors before the framework enters into force, for example for the data audits required as part of the conformity assessments that the AI Regulation mandates for high-risk AI systems.
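To give a flavour of what such a data audit might involve, here’s a minimal, purely illustrative sketch (not a procedure prescribed by the Regulation, and not EY tooling) of the kind of automated checks an auditor might run over a training data set; the column names, groups and metrics are all hypothetical.

```python
# Illustrative only: simple automated checks a data audit might run over a
# training set ahead of a high-risk AI system's conformity assessment.
# The column names ("gender", "outcome") and the checks themselves are hypothetical.
import pandas as pd

def basic_data_audit(df: pd.DataFrame, protected_col: str, label_col: str) -> dict:
    """Report row count, missing values, duplicates and outcome balance by group."""
    return {
        "rows": len(df),
        "missing_values": int(df.isna().sum().sum()),
        "duplicate_rows": int(df.duplicated().sum()),
        # Positive-outcome rate per group: a crude representativeness/bias check.
        "positive_rate_by_group": df.groupby(protected_col)[label_col].mean().to_dict(),
    }

training_data = pd.DataFrame({
    "gender": ["F", "M", "F", "M", "M", "F"],
    "income": [42000, 55000, None, 61000, 47000, 39000],
    "outcome": [1, 1, 0, 1, 0, 0],  # 1 = favourable decision in the historical data
})
print(basic_data_audit(training_data, protected_col="gender", label_col="outcome"))
```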

Good auditors, even those without much experience in AI, will no doubt be poised for what’s to come — and I imagine they’ll have the necessary skills and knowledge to confidently tackle all of the above and more.

Supercharged auditing powers

Ironically, AI also has a major role to play in transforming the audit itself. At EY, we’re turning to AI, alongside other innovations like blockchain, automation and even drone tech, to make the audit process smarter, faster, cheaper, higher quality and more rigorous.

We’re embedding AI across our end-to-end audit process. Techniques including machine learning and deep learning allow us to quickly and accurately analyse and extract information from unstructured data (such as emails, social media posts, audio files, invoices and images) to obtain evidence, and to evaluate large data sets to identify, assess and respond to the risks of material misstatement due to fraud. This frees up auditors’ time, allowing them to dedicate more hours to tasks that require human scrutiny.
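As a purely illustrative example of that second point (a toy sketch, not how EY’s audit platforms actually work), an anomaly-detection model can surface unusual journal entries for a human auditor to review; the column names and contamination rate below are hypothetical.

```python
# Illustrative only: flag unusual journal entries for human review.
# Columns (amount, posting_hour, is_weekend) and the contamination rate are hypothetical.
import pandas as pd
from sklearn.ensemble import IsolationForest

entries = pd.DataFrame({
    "amount": [1200.00, 950.50, 980000.00, 1100.25, 875.00],
    "posting_hour": [10, 11, 2, 14, 9],  # hour of day the entry was posted
    "is_weekend": [0, 0, 1, 0, 0],       # 1 if posted on a weekend
})

# Isolation Forest scores entries by how easily they can be isolated from the rest;
# a label of -1 marks an outlier worth a closer look, not a conclusion of fraud.
model = IsolationForest(contamination=0.2, random_state=42)
entries["flag"] = model.fit_predict(entries[["amount", "posting_hour", "is_weekend"]])

print(entries[entries["flag"] == -1])  # entries routed to the auditor for scrutiny
```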

At the moment, the audit area in which AI is most commonly applied is contract review. AI tools let auditors examine large volumes of contracts at a speed and level of precision far beyond what any single person, or even a sizeable team, could hope to achieve. For instance, auditors can automatically extract lease information against pre-selected criteria (start date, payment due, termination clause and so on) before flagging and assessing risks more effectively.
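To make that concrete, here’s a deliberately simple sketch of pulling a few pre-selected lease terms out of contract text; real tools use far more sophisticated natural language processing, and the field names and patterns here are hypothetical.

```python
# Illustrative only: extract pre-selected lease criteria from contract text.
# The patterns below are hypothetical and much cruder than production NLP tooling.
import re

LEASE_FIELDS = {
    "start_date": r"commencement date[:\s]+(\d{1,2} \w+ \d{4})",
    "payment_due": r"monthly rent of ([\$£€][\d,]+(?:\.\d{2})?)",
    "termination_clause": r"(terminat\w+ .*?\.)",
}

def extract_lease_terms(contract_text: str) -> dict:
    """Return whichever pre-selected lease terms can be found in the text."""
    found = {}
    for field, pattern in LEASE_FIELDS.items():
        match = re.search(pattern, contract_text, flags=re.IGNORECASE | re.DOTALL)
        found[field] = match.group(1).strip() if match else None
    return found

sample = ("The lease has a commencement date: 1 March 2022, with monthly rent of "
          "£2,500.00 payable in advance. Either party may terminate on 90 days' written notice.")
print(extract_lease_terms(sample))
```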

Any Kryptonite to watch out for?

Sure there is, but it mostly relates to technical stuff. Take the ‘black box’ effect (briefly discussed in my last blog about trusted AI), which is often cited as the greatest challenge facing auditors. The thing is, as long as organisations make sure their auditors are properly trained to measure and assess AI governance as part of their audits, this should pose no real problem. So, at this stage, there’s really no use in auditors worrying excessively that Kryptonite will foil their “world-saving” mission (again, tongue in cheek)!
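For what it’s worth, there are already practical ways to peer inside a so-called black box. Here’s a rough, hypothetical sketch (not a prescribed audit procedure) using permutation importance to see which inputs a model actually leans on; the model and data are made up for illustration.

```python
# Illustrative only: probe a "black box" model by shuffling each input feature
# and measuring how much predictive accuracy drops (permutation importance).
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))                    # three hypothetical input features
y = (X[:, 0] + 0.1 * rng.normal(size=500)) > 0   # outcome mostly driven by feature 0

model = GradientBoostingClassifier().fit(X, y)

# A large drop in accuracy when a feature is shuffled means the model leans on it heavily.
result = permutation_importance(model, X, y, n_repeats=10, random_state=0)
for name, score in zip(["feature_0", "feature_1", "feature_2"], result.importances_mean):
    print(f"{name}: {score:.3f}")
```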


Behavioural psychologist; AI-quisitive; EY UK&I Client Technology & Innovation Officer. Views my own & don't represent EY’s position. catrionacampbell.com
