An AI meeting tool can produce a readable summary of a one-hour board meeting in about forty seconds. The output is clean, structured, and often impressive. It will list attendees, headline topics, and a handful of bullet-point actions. For informal catch-ups, this is genuinely useful. For a board, a charity trustee meeting, a pension committee, or a regulated sub-committee, it is something else entirely. It is a draft. And a draft, no matter how polished, is not a record.
That distinction is the entire subject of this post. We build ācta, which uses AI to draft minutes, so we have no interest in a blanket anti-AI argument. The argument we do want to make is narrower and more practical: AI is an excellent tool for accelerating the drafting of minutes. It is a poor substitute for the governance step that turns a draft into a record. Conflating those two things is how organizations end up with files that look like minutes but carry none of the weight.
What AI Summary Tools Actually Produce
If you strip away the marketing, what a typical meeting AI produces falls into one of three shapes.
The first is a transcript: a near-verbatim rendering of the audio, with speaker labels of varying reliability. Useful for search, almost useless as a governance document, because nobody reads a thirty-page transcript to find a decision.
The second is an extractive summary: the model pulls sentences it judges important and stitches them together. The result looks coherent but is heavily shaped by whatever the model decided to keep. Context is lost. Nuance is lost. A crucial qualification on a decision ("subject to the CFO confirming the cash position") can quietly drop out.
The third is a generative summary: the model writes new prose describing what happened. This reads best and is the most dangerous. Hallucinations are rare but real, and they land in your document without flags. Attendees get misattributed. A point raised by one director is recorded as a group consensus. The output is fluent enough that nobody notices until a dispute forces a careful read six months later.
All three outputs share one defining property: no human has yet confirmed them. That is not a detail. It is the entire governance gap.
A Summary Is Not a Record
A summary is a description of what a piece of software thinks happened. A record is a document that a named, accountable person has read, corrected, and confirmed as accurate. Those are different artifacts with different legal, regulatory, and evidential status, even when they contain the same words.
Corporate law on both sides of the Atlantic draws the same line. Under section 248 of the Companies Act 2006, every UK company must keep minutes of directors' meetings for at least ten years. Those minutes are, by statute, evidence of the proceedings. Section 249 goes further: a signed minute is evidence that the meeting was duly held and convened, that all proceedings were duly had, and that all appointments at the meeting were valid. In the US, the Revised Model Business Corporation Act (section 16.01) and state corporate codes impose comparable obligations, and failure to maintain proper minutes can expose directors to fiduciary breach claims. A signed minute. Not a file. Not a summary. Not an unreviewed AI output sitting in a shared drive.
The Charity Commission's CC48 guidance on charity meetings and decision-making makes the same point in plainer language. Trustees are expected to ensure that decisions are properly recorded, that minutes are reviewed and agreed, and that the record can be produced if the Commission asks for it. In the US, the IRS imposes equivalent expectations on 501(c)(3) organizations — Form 990 asks explicitly whether the board contemporaneously documents its meetings, and state attorneys general can demand production of board minutes. An AI summary that nobody has formally approved is not a decision record. It is a starting point for one.
The pattern repeats across sectors and jurisdictions. In financial services, the UK's FCA Senior Managers and Certification Regime expects documented evidence of how decisions were reached as proof of individual accountability, and the SEC's Rule 17a-4 governs precisely how long, and in what form, broker-dealers must preserve records, including committee minutes. On data governance, the ICO's records management code expects organizations to maintain an accurate audit trail of policy decisions, while in the US, NARA guidance and sector-specific regulators impose equivalent expectations on covered entities. In each case, the obligation attaches to the confirmed record, not to the provisional draft.
Where AI Genuinely Adds Value
Here is where the argument cuts the other way. The bottleneck in meeting minutes has never been the format. It has been getting the first draft written while memories are fresh, context is intact, and the secretary is not also trying to participate in the conversation they are documenting.
Best practice in most governance frameworks is the 24-hour rule: draft minutes should be distributed within a day of the meeting. Any later and memory erodes, attendees lose the thread, and the document that eventually emerges is a reconstruction rather than a record. Manual minute-taking routinely fails this test. A clerk taking longhand notes while also contributing to a discussion will typically need two or three hours, uninterrupted, the next day to produce a clean draft. That time rarely exists.
This is where AI earns its place. A well-tuned transcription and drafting pipeline can deliver a structured first draft within minutes of the meeting ending. Attendees, agenda items, decisions, and candidate action items are all laid out. The clerk's job shifts from transcribing to reviewing: checking accuracy, filling gaps, correcting attribution, and confirming that the decisions as recorded match what was actually agreed. That is a meaningfully different task, and a much faster one.
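To make the shape of that structured draft concrete, here is a minimal sketch of what such a pipeline might emit. The field names are purely illustrative assumptions, not a real product schema; the one deliberate choice is that a draft is born with `status = "DRAFT"` and nothing in the pipeline can change that.

```python
from dataclasses import dataclass

# Hypothetical sketch of a structured minutes draft.
# Field names are illustrative, not a real product schema.
@dataclass
class ActionItem:
    description: str
    owner: str
    due_date: str  # ISO date, e.g. "2025-06-19"

@dataclass
class DraftMinutes:
    meeting_date: str
    attendees: list[str]
    apologies: list[str]
    decisions: list[str]        # as transcribed: unverified until reviewed
    actions: list[ActionItem]
    status: str = "DRAFT"       # never "CONFIRMED" without human sign-off

draft = DraftMinutes(
    meeting_date="2025-06-12",
    attendees=["Chair", "CFO", "Clerk"],
    apologies=["Non-executive director"],
    decisions=["Approve budget, subject to CFO confirming the cash position"],
    actions=[ActionItem("Confirm cash position", owner="CFO",
                        due_date="2025-06-19")],
)
```

The clerk's review then operates on these fields directly, which is what makes checking attribution and decisions faster than rewriting prose from scratch.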
AI is at its best when it compresses the time between meeting and draft. It is at its worst when it is treated as the final step rather than the first step.
There is a second, quieter benefit. AI is consistent in a way humans are not. It will not skip the attendance section because it was a short meeting. It will not forget to capture the date of the next meeting. It will not decide, on a tired Friday, that this week's action items are obvious enough to leave out. For a committee that meets monthly and cares about a uniform record across years, that consistency has real value.
The Step That Cannot Be Automated Away
The review and confirmation step is not paperwork. It is the moment at which a document becomes a record. It has three components that each do something AI cannot do.
1. A named human takes responsibility
When the chair signs minutes, they are attesting that the record is accurate. Their signature carries legal weight. If the record is later challenged, the chair is accountable for it. No AI system can carry that accountability, because no AI system has a legal identity, a professional reputation, or a fiduciary duty to the organization.
2. Discrepancies get surfaced and resolved
The review step is where an attendee reads the draft and says, "That's not quite what was agreed. We said six months conditional on sign-off from legal, not six months outright." These corrections are the point. They are the signal that real attention was paid. A document that goes from AI straight to archive, without this friction, preserves whatever errors the model made and buries them under the appearance of formality.
3. The record is locked
Confirmed minutes should not be silently editable. Once approved, the document is fixed. Corrections, if needed, go through an amendment process at the next meeting and are themselves minuted. This discipline is what makes the record defensible years later. An AI output held in a generic document store, where anyone with write access can change a date or an attendee without trace, fails this test the moment you look at it.
Attempts to automate this step away almost always reduce to "trust the model". That is an odd position to take for any decision important enough to be minuted in the first place.
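The discipline described by these three components can be sketched as a small state machine: a draft becomes a record only through a named approver, a confirmed record refuses silent edits, and corrections go through a logged amendment. This is a minimal illustration of the principle under assumed names, not an implementation of any particular product.

```python
class MinutesRecord:
    """Illustrative sketch: DRAFT -> CONFIRMED requires a named approver;
    once confirmed, changes are only possible via a logged amendment."""

    def __init__(self, text: str):
        self.text = text
        self.status = "DRAFT"
        self.approved_by = None
        self.amendments = []  # (approver, old_text, new_text) tuples

    def confirm(self, approver: str) -> None:
        if not approver:
            raise ValueError("confirmation requires a named approver")
        self.status = "CONFIRMED"
        self.approved_by = approver

    def edit(self, new_text: str) -> None:
        if self.status == "CONFIRMED":
            raise PermissionError("confirmed minutes are locked; use amend()")
        self.text = new_text

    def amend(self, new_text: str, approver: str) -> None:
        if self.status != "CONFIRMED":
            raise ValueError("only confirmed minutes can be amended")
        # The amendment itself is part of the record, attributed to a person.
        self.amendments.append((approver, self.text, new_text))
        self.text = new_text
```

The point of the sketch is the shape, not the code: every path that changes a confirmed record passes through an attributed, logged step.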
Where Unconfirmed Summaries Fall Short
The weakness of an unconfirmed AI summary becomes concrete when you imagine it being produced in evidence. A few scenarios show the pattern.
- A charity is asked by the Commission to demonstrate how trustees considered a significant related-party transaction. The file produced is an AI summary, never formally approved. The Commission is within its rights to treat it as an unreliable account.
- A listed company faces a shareholder claim that a decision was taken outside proper procedure. Under corporate statutes on both sides of the Atlantic, a signed minute would serve as evidence. An unsigned AI draft is just a piece of prose.
- A social housing provider is audited on its repairs policy. The audit trail shows policy changes discussed at committees, but the minutes are AI outputs with no confirmation step recorded. The auditor flags the weakness; the rating drops.
- An employment tribunal or court examines what was said at a disciplinary panel. The panel used an AI recorder. The unreviewed transcript contains an attribution error the panel chair would have caught. It is now in front of a judge.
- A pension trustee board relies on an AI summary to document an investment decision. Three years later, a beneficiary challenges the decision. The trustees cannot point to a signed minute, only a file modified by an unknown user at an unknown time.
In none of these cases is the underlying AI tool the problem. The problem is the missing step between draft and record. Tools that encourage organizations to skip that step are, in effect, lowering the evidential quality of the record without anyone choosing to do so.
How ācta Uses AI
We built ācta around a specific view of how AI should sit inside a governance workflow. The view is opinionated, and it is informed by watching enough compliance teams try to retrofit AI summary tools into regulated processes to see where the wheels come off.
AI handles the drafting. A recording goes in, a transcript is produced, and a structured draft of the minutes comes out: attendees, agenda items, decisions, action owners, due dates, and deferred items. The draft is ready within minutes, which leaves the full day the 24-hour rule allows for review and circulation.
Humans handle the governance. The draft is explicitly a draft. It is routed to the chair or minute-taker for review. Edits are tracked. Attendees can be invited to review and raise change requests before confirmation. Nothing is treated as final until a named user, with appropriate authority, signs it off.
Once confirmed, the record is locked. The confirmed version is hashed, timestamped, and stored in a way that makes silent modification detectable. If a correction is later needed, it goes through a formal amendment, is attributed to a person, and is itself part of the audit trail. The difference between the draft version and the confirmed version is always recoverable. That is what tamper-evidence actually means in practice.
The rule we follow is simple. AI drafts. Humans confirm. The system locks. Each step is accountable, and no step tries to do another's job.
What to Look for in a Meeting Tool
If your organization has governance obligations, a handful of practical questions cut through the marketing on almost any meeting tool.
- Is there an explicit confirmation step, with a named approver, that changes the document's status from draft to record?
- After confirmation, is the record locked against silent modification, with any changes captured in an audit trail?
- Does the tool distinguish between attendees, apologies, and absentees, and does it let a human correct attribution errors before sign-off?
- Can you produce, on demand, evidence of who approved a specific set of minutes and when?
- Are records retained for the period your regulator requires, and can you export them cleanly if you need to change vendors?
- Does the vendor publish how the AI component is used, and does it let you review the full transcript that fed the summary?
A tool that answers yes to all six is doing the job. A tool that answers yes to only the first two or three is a note-taker with ambitions. There is nothing wrong with a note-taker. Just do not ask it to stand in for a governance record.
The Honest Position
The honest position on AI and minutes is that both things can be true at once. AI has genuinely transformed how fast a good draft can be produced, and that is a meaningful advance for any organization that takes its meeting record seriously. It has not changed, and cannot change, the part of the process where a human being accepts responsibility for the accuracy of the record.
If you are running a regulated body, a charity or nonprofit, a pension scheme, a listed company, or any organization where the minutes might one day be read by an auditor, a regulator, or a judge, the test is not whether your tool uses AI. The test is whether your process ends with a confirmed, locked, attributable record. If it does, AI is a force multiplier. If it does not, AI is, at best, a fast way to produce something that looks like a record without being one.
The Workflow the Argument Requires
ācta is built around the sequence this post argues is necessary. The AI handles the draft: a recording goes in, a structured set of minutes comes out, with attendees, decisions, action owners, and deferred items, ready within minutes of the meeting ending. That is where the automation stops.
The rest is governed by a deliberate workflow. The draft is routed to the chair for review, corrections are tracked, and nothing changes status until a named person formally approves it. At that point the confirmed record is sealed with a SHA-256 hash and timestamped, so any subsequent modification is immediately detectable. If a correction is ever needed, it goes through a formal amendment that is itself part of the audit trail. The AI accelerates the work that was always going to happen. The workflow enforces the step that turns that work into a record.
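Hash-based sealing of this kind is a generic technique, and it takes only a few lines to illustrate. The sketch below shows the principle, not ācta's actual implementation: at confirmation, record a SHA-256 digest of the approved text alongside the approver and a timestamp; at any later point, recomputing the digest reveals whether the text has changed by even one byte.

```python
import hashlib
import time

def seal(minutes_text: str, approver: str) -> dict:
    """Produce a tamper-evident seal for a confirmed record:
    a hash of the approved text, plus who approved it and when."""
    return {
        "sha256": hashlib.sha256(minutes_text.encode("utf-8")).hexdigest(),
        "approved_by": approver,
        "approved_at": int(time.time()),
    }

def verify(minutes_text: str, seal_record: dict) -> bool:
    """True only if the text is byte-for-byte what was sealed."""
    digest = hashlib.sha256(minutes_text.encode("utf-8")).hexdigest()
    return digest == seal_record["sha256"]

record = "Minutes of the board meeting held 12 June 2025 ..."
s = seal(record, approver="Chair")
assert verify(record, s)                    # unchanged text: passes
assert not verify(record + " (edited)", s)  # silent edit: detected
```

Note what the seal does and does not do: it makes modification detectable, but the surrounding workflow still has to store the seal somewhere the editor of the text cannot also rewrite.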
Ask one question of any minutes in your archive. Could you prove, today, that this document is what was approved, unchanged, by the person who approved it? If the answer is no, the tool that produced it is not the one you need.