AI and board reporting: Reflections from the Star judgment

    A significant Federal Court judgment offers a timely view on how artificial intelligence (AI) is already intersecting with board practice and what that means for directors.


    In the ASIC v Bekier (Star Entertainment) judgment, Justice Michael Lee described a familiar modern problem: the ever-expanding board pack.

    What once arrived as “a modest bundle of papers designed to assist judgment”, Justice Lee said, now resembles “an electronic publishing project”. Executive summaries sit atop summaries of summaries, followed by detailed papers, annexures and technical appendices. Individually sensible documents, taken together, become “oppressive”.

    The practical consequence is a kind of directorial triage. Directors read what appears central, scan what seems material and assume anything truly alarming would have been highlighted.

    Justice Lee suggested the growth of board packs reflects both director expectations and management psychology. Without discipline around synthesis, preparers are tempted to include everything. The pack ends up doing two jobs: officially informing the board, and unofficially insulating the preparers from criticism.

    The case itself concerned ASIC’s civil penalty proceedings against certain former Star Entertainment directors and officers. Reflecting on the governance issues illustrated by the case – including submissions about information “buried” in voluminous board materials – Justice Lee turned to a question increasingly confronting boards: how emerging technologies might interact with boardroom practice.

    He cited the AICD resource AI Use by Directors and Boards: Early Insights as repaying “close reading”.

    AI as an aid, not a replacement

    Justice Lee acknowledged what the research identifies. “It would be jejune to deny,” he said, “that many individual directors are using AI informally to prepare for meetings.”

    But the legal principle remains unchanged. Directors must take reasonable steps to place themselves in a position to guide and monitor management, taking “a diligent and intelligent interest” in the information available to them and applying an enquiring mind to their responsibilities.

    AI may assist with that task. Used poorly, it may undermine it.

    The distinction – between AI as a substitute for judgement and AI as a tool to sharpen it – lies at the centre of the governance challenge.

    The AICD’s early insights research suggests director use of AI remains limited. Many directors lack access to enterprise AI systems, face restrictive organisational policies or are wary of legal risk. The result is a “two-speed” dynamic: individual directors experiment informally while collective board adoption lags behind.

    That gap creates an accountability ambiguity. If AI is being used, the Federal Court judgment suggests, boards should govern it deliberately rather than quietly tolerating informal “shadow” use. Justice Lee acknowledged that information overload could be addressed, at least in part, through the “principled and transparent” use of emergent technology – but such use must not displace independent judgement and individual diligence.

    Early-stage director experimentation with AI

    In practice, the technology appears to function as a thinking aid rather than a decision-maker.

    At a recent AICD event on AI in the boardroom, director Tim Trumper described feeding breaking takeover news into an AI model alongside publicly available information about the company and asking what Warren Buffett and Michael Porter might want to know. The resulting questions, he said, “elevated the conversation”.

    Shirley Chowdary, Pro-Chancellor at the University of Sydney, described strategy sessions where AI was used as a real-time challenger – drawing on previous documents to surface gaps in thinking.

    Such experiments remain early-stage. Research in Canada, conducted by event panellist Professor Michael Hartmann, suggests 86 per cent of Canadian directors say AI is not yet part of regular strategy deliberations, and only a small minority receive ongoing AI training. The 2025 Annual Corporate Directors Survey found that two-thirds of boards are not yet incorporating AI into their oversight functions.

    A cautious approach to AI adoption

    Boards face several tensions. Confidentiality and other legal and regulatory concerns remain acute, particularly where generative models operate outside secure environments. Regulation is evolving but incomplete, and digital literacy does not necessarily translate into enthusiasm for adoption. Professor Hartmann said the research shows the most technologically sophisticated boards are also among the most cautious users.

    Even so, boards are encouraged to discuss openly how AI tools can support deliberation while preserving accountability and trust in management.

    Chowdary reinforced the importance of director judgement. “You have to have the human in the loop to actually still use judgement and have those discussions – as opposed to taking [AI] as a source of truth.” On board capability, she said: “Boards live and die as a collective. Board education together will make the difference.”

    Trumper offered a practical starting point he calls the AI Hippocratic oath. “Whatever information you have, keep it sacrosanct. Do no harm. If you do those two things, you're going to be on the right side of most of this.”

    He also reflected on commercial reality. “You don't want management to feel like the board doesn't get this. That's a bad place to be. And probably the reverse is also possible – where management feels like the board is in front of them. You need the boats rising together.”

    Justice Lee's broader observation provides an anchor. He suggests however directors choose to read, summarise or interrogate information, the board's core function remains the same: “The modalities of reading and examining material in board papers might change, but analysing and understanding information provided by management is a core function of a board; after all, this is the primary way by which directors access the information necessary to make informed, bona fide decisions.”

    AI may alter the tools of governance. It does not alter the responsibility.

    Resources

    AI Use by Directors and Boards: Early Insights is a practical AICD resource developed for directors and boards, based on interviews with leading practitioners across Australia. It includes case studies, use cases and a framework for boardroom discussions on AI.

    Related resources include Effective Board Minutes in the Age of AI, developed with the Governance Institute of Australia.

    Access the Federal Court judgment.
