Australian company directors are increasingly using AI tools as part of governance processes, from AI-generated board pack summaries to automated minute-taking and trend analysis.
These tools can improve efficiency and insight. But their adoption raises important legal and governance considerations.
Directors’ duties and accountability
AI is a tool. It may assist directors in performing their role, but it does not reduce their accountability for the decisions they make.
Directors’ duties under Australian law are principles-based. ASIC has indicated those principles apply regardless of the tools used. For example, the duty of care and diligence is likely to require directors to review the full board materials rather than relying solely on AI-generated summaries.
There is no substitute for properly exercising business judgment. While the law does not demand perfection, it does require directors to inform themselves "to the extent they reasonably believe to be appropriate" and to act rationally in forming a view (s 180, Corporations Act 2001 (Cth)). Case law consistently emphasises that directors must engage with the substance of the issues before them. Unverified AI summaries alone will not meet that standard.
Directors must also understand the limitations of any AI tools used, including risks of bias, inaccuracy and incomplete outputs, in order to discharge their duties.
Best practice for boards using AI
Boards adopting AI should follow established principles of AI governance and accountability. Key considerations include:
Conscious adoption
Undertake due diligence before implementation. Assess accuracy rates and potential biases, and undertake an AI risk assessment tailored to the proposed use case.
Security
Ensure security settings are commensurate with the sensitivity of the information processed. Board packs often contain highly confidential and commercially sensitive material. Data residency and storage arrangements should also be considered where appropriate.
Training and policy
Adopt clear internal AI policies and ensure directors and support staff are trained in their application.
Human-in-the-loop
No board decision should rely on AI outputs without thorough human review.
AI literacy
Directors should have sufficient AI literacy to understand how tools operate, their limitations and associated risks. This will be critical in demonstrating compliance with duties of care and diligence when AI forms part of the decision-making process.
AI can enhance board effectiveness, but only where governance, accountability and oversight remain firmly in human hands.
For more on the specific risks and legal issues associated with AI note-takers, see our related article.
All information on this site is of a general nature only and is not intended to be relied upon as, nor to be a substitute for, specific legal professional advice. No responsibility for the loss occasioned to any person acting on or refraining from action as a result of any material published can be accepted.