APRA’s governance letter wasn’t written for accountants. Read it anyway.
APRA's AI governance letter to banks, insurers and super funds could also have implications for accounting firms that have supplier relationships with these institutions.
The managing partner at a mid-tier accounting firm probably read APRA’s April 2026 letter the same way most accounting professionals did: as a banking story. APRA regulates financial institutions. The firm lodges tax returns and audits SMEs. Surely that is someone else’s compliance burden.
It is not. And the mechanism that connects them is worth understanding now - before a regulated client asks the question first.
What the regulator found - and why it matters beyond the regulated sector
On 30 April 2026, Australia’s prudential regulator wrote to every bank, insurer, and superannuation fund in the country. The message: get your AI risk controls in order, or face enforcement action.
What APRA found in its review of Australia’s largest, best-resourced institutions was systematic failure across three dimensions: audit methods designed for static systems applied to AI that learns and degrades continuously; internal risk functions signing off on technology they cannot independently evaluate; and regulated entities treating vendor assurances as a substitute for genuine governance.
These are organisations with dedicated risk teams, board-level audit committees, and governance frameworks most accounting practices would consider aspirational. If they are failing here, the question for your practice is not whether these problems exist. It is whether anyone is looking for them.
“The governance standard your regulated clients are being held to has just moved. Within months, they will ask you to demonstrate yours.”
The cascade mechanism
The transmission path is direct. When APRA-regulated entities cannot demonstrate adequate oversight of their AI vendors, APRA holds the entity responsible. That finding pushes upstream - into supplier contracts, procurement questionnaires, and client engagement terms. This is the documented pattern from every comparable regulatory uplift. When CPS 230 tightened operational risk standards, every supplier in the chain felt it. AI governance is following the same path.
Accounting practices that supply audit, tax, advisory, or management consulting services to APRA-regulated clients are part of that supply chain. Your largest regulated clients will soon ask for documented evidence of your AI controls. The firms that have built governance architecture before that question arrives will be positioned very differently from those that have not.
A second obligation compounds this. From December 2026, Australia’s updated Privacy Act requires any organisation using AI to make substantially automated decisions affecting individuals to disclose this in its privacy policy. Financial planning practices, SMSF advisors, and firms using AI-assisted risk or research tools are likely in scope. The question of whether your practice is affected deserves a deliberate answer, not an assumption.
Myth: Professional standards already cover this. APES 110 and Tax Practitioners Board guidance are our AI governance framework.
Reality: APES 110 addresses professional conduct. It does not speak to AI model drift, automated output validation, or data provenance - the specific failure modes APRA is now documenting with enforcement authority. The Tax Practitioners Board’s guidance on AI is still emerging. In the absence of sector-specific standards, the governance expectations of your regulated clients become the de facto benchmark. Waiting for the regulator to catch up is not a governance strategy. It is a liability position.
Three questions your practice should be able to answer now
1. What AI is touching client work? - Most mid-tier practices using AI-assisted research platforms, tax software with AI features, or document review tools cannot map this accurately. That is the starting point - not because a regulator demands it today, but because you cannot govern what you have not named. Problems first, platforms second. The inventory is the foundation.
2. What can your vendors actually demonstrate? - Not marketing materials. Documented validation processes, data provenance, incident response protocols, and controls for model drift. A vendor who cannot provide this evidence is a governance liability that your regulated clients may not accept when they conduct their own supplier due diligence.
3. Who is accountable for AI risk in your firm? - Not AI adoption. Not AI enthusiasm. AI risk. In a small practice, that is the principal. In a larger firm, a nominated partner. The name needs to be written down and included in your governance framework - not distributed across whoever seems most interested this month.
The question your clients will ask
APRA has articulated, with enforcement authority, exactly what AI governance failure looks like. The firms that read this as a banking story will revisit that judgement when a regulated client asks for documented evidence of their AI controls.
Governance as enabler, not blocker - but first, governance has to exist. The practices that build it now will not be scrambling when the question arrives from outside.
The question isn’t whether your practice uses AI. It’s whether you can account for it.