The silent cost of AI efficiency: Part 1
What accounting firms are losing while they’re winning.
Eighteen months ago, a partner at a Melbourne firm spent $180,000 implementing AI across her compliance and tax workflows. Turnaround times dropped 30 per cent. Error rates fell. Six months later, she declined to renew two graduate positions. The numbers didn’t support them.
I understand that decision completely. What concerns me is what that partner — and hundreds like her across the profession — haven’t yet calculated.
The numbers most firms are reading selectively
Wolters Kluwer’s 2025 Future Ready Accountant report found AI adoption in accounting firms jumped from 9 per cent to 41 per cent in a single year. A Stanford Graduate School of Business study tracking 277 accountants found AI users finalised monthly statements 7.5 days faster.
These are genuine gains. Firms are right to pursue them.
But a separate Stanford study, released in August 2025 and based on ADP payroll records from millions of American workers, found something the productivity dashboards miss entirely. Employment for workers aged 22 to 25 in AI-exposed occupations — including accounting — had declined by 13 per cent since 2022. More experienced workers in those same fields were stable or growing. It is specifically the entry-level positions that are contracting.
Stanford’s researchers identified why: AI excels at replacing codified knowledge — the formal learning from education — but cannot yet replace tacit knowledge built through years of practice. The problem is that codified tasks are precisely where junior accountants learn to think.
CPA Australia’s 2025 Business Technology Report found 19 per cent of Asia-Pacific businesses had already scaled back junior accounting hiring because of AI. CA ANZ data shows enrolments in the Accounting Professional Year program dropped from 7,122 in 2018 to just 340 in 2024. In that environment, cutting the graduate positions that remain is a structural bet, not a staffing adjustment.
What that bet is actually wagering
CPA Australia’s guidance on generative AI is unambiguous: firms should use AI to streamline administration, not replace professional judgment. That principle is right. Most firms are executing the first half while inadvertently undermining the conditions required for the second.
Complex professional judgment is not something qualifications confer. It is built through unglamorous work: reconciling accounts that won’t balance, preparing returns that require genuine interpretation, explaining difficult conclusions to a client face-to-face. These are where graduates learn what the numbers actually mean. They are precisely the tasks that AI is now replacing.
Research published in Springer’s AI & Society journal found that AI dependence leads to the erosion of activity awareness, competence maintenance, and output assessment. Research in Cognitive Research: Principles and Implications warned that AI assistants may accelerate skill decay among experts and hinder skill acquisition among learners — effects that may go unrecognised until they are significant.
APES 110 requires accountants to form independent professional judgments. That requirement does not change when the first draft of an analysis is produced by software. But forming that judgment genuinely requires a kind of lived formation that is being quietly optimised away.
The value AI delivers inside a firm today depends on expertise that was built before AI arrived.
The partners currently reviewing AI outputs, the managers validating the work, the seniors handling escalations — they all learned their craft through exactly the work AI is now handling. If the conditions that form that expertise are removed before a deliberate replacement is designed, the next cohort of senior practitioners will have a different foundation. The output will look similar, for a while. The capability beneath it will be different.
Myth vs Reality
Myth: “Our team still does all the real thinking. AI just handles the admin. Nothing important has changed.”
Reality: The boundary between ‘admin’ and ‘thinking’ in accounting is not where that statement assumes. When AI reconciles an account, it performs the pattern recognition that trains a junior to spot anomalies independently. When it flags a compliance issue, it removes the moment where a graduate first learns to identify one without prompting. Efficiency and expertise formation draw from the same source material. Firms that optimise one without protecting the other will discover the trade-off — within five years, not fifty.
The question that carries through both articles
Are the people inside this firm getting better or worse at the work that makes our tools worth having?
I am not arguing that firms should slow AI adoption. The gains are real and the competitive imperative is genuine. What I am arguing is that decisions about where AI is deployed, and what happens to the human time it frees, are not technology decisions. They are practice design decisions — and most firms are treating them as the former while leaving the latter unanswered.
That is not a crisis. It is a design problem. Design problems are solvable, if they are recognised early enough.
The second article in this series builds the practical framework.