
The Civil Justice Council recommends mandatory declarations confirming AI was not used in preparing witness statements for trial, aiming to preserve testimonial integrity amid growing AI adoption in legal practice.

The Civil Justice Council has proposed that solicitors and barristers should be required to confirm they did not use artificial intelligence to generate the content of witness statements intended for trial, while stopping short of mandating AI disclosures for most other court documents so long as a lawyer accepts professional responsibility for them. According to the council’s interim report and consultation paper, the measure aims to preserve the truthfulness and personal voice of witness testimony while allowing legal teams to continue exploiting AI for research, drafting and administrative tasks where appropriate.

The working group, chaired by Sir Colin Birss, says large language models have already reshaped legal practice but carry well-documented risks, including hallucination and the embedding of training-data biases. It therefore recommends a targeted rule: witness statements prepared for trial in civil proceedings should carry a declaration that AI was not used to create, embellish, rephrase or otherwise alter the witness’s evidence, except for non-text-generating aids such as transcription. The proposal is presented as part of a wider consultation on whether new rules are required for pleadings, skeleton arguments and expert material.

The group argues that, for non-trial documents such as statements of case and skeleton arguments, the existing requirement that a named legal representative take professional responsibility for the document should be sufficient to address concerns about AI-assisted drafting. It notes that some jurisdictions already require AI declarations for such documents, but says blanket declarations are unnecessary where responsibility is clear. The consultation invites views on where disclosure should be required and whether current professional duties provide adequate safeguards.

The paper also points to comparable international practice and recent judicial guidance overseas that restrict the use of generative AI in affidavits and witness statements, citing concerns that automated drafting risks diluting or embellishing a deponent’s own account. The working group highlights the approach taken in other common law jurisdictions and suggests those precedents help explain why stricter limits are appropriate for evidence admitted at trial.

On expert evidence the council proposes that experts should confirm in their statement of truth that any AI assistance has been identified and explained, apart from administrative uses such as transcription. The working group says that transparency about analytical tools used in expert reports would help the court assess reliability without banning use outright. Industry commentary has already emphasised the need to balance innovation with safeguards so courts can evaluate the provenance and limits of algorithmic outputs.

The consultation makes no new proposals on disclosure more generally, observing that AI-assisted review and analytics are long established in disclosure practice and that parties appear to be cooperating over their use. It does flag the particular difficulty of litigants in person, recognising both the access-to-justice benefits of AI tools and the risk that unregulated use could introduce inaccurate or fictitious material into proceedings; the working group says regulation of that area falls beyond its current remit but merits further study.

If adopted, the council’s recommendations would leave courts and professional regulators to translate principles into practice, with practitioners likely to see new drafting checklists or prescribed declarations for trial witness statements and clearer expectations for expert reports. Templates and precedents already used by practitioners for witness and expert declarations may need updating to reflect any final rules emerging from the consultation.

Source Reference Map

Inspired by headline at: [1]

Sources by paragraph:

  • Paragraph 1: [2], [3]
  • Paragraph 2: [2]
  • Paragraph 3: [3]
  • Paragraph 4: [4]
  • Paragraph 5: [3]
  • Paragraph 6: [2]
  • Paragraph 7: [6]

Source: Noah Wire Services

Noah Fact Check Pro

The draft above was created using the information available at the time the story first emerged. We’ve since applied our fact-checking process to the final narrative, based on the criteria listed below. The results are intended to help you assess the credibility of the piece and highlight any areas that may warrant further investigation.

Freshness check

Score: 8

Notes:
The article was published on 19 February 2026, which is within the past 7 days, indicating freshness.

Quotes check

Score: 7

Notes:
The article includes direct quotes from the Civil Justice Council’s interim report and consultation paper. However, the earliest known usage of these quotes cannot be independently verified, raising concerns about their originality.

Source reliability

Score: 8

Notes:
The article originates from Legal Futures, a reputable UK-based legal news outlet. However, the article relies on a press release from the Civil Justice Council, which may introduce bias.

Plausibility check

Score: 9

Notes:
The claims about the Civil Justice Council’s proposal align with known discussions on AI in legal practice. Similar proposals have been made in other jurisdictions, such as the New South Wales Supreme Court’s guidelines on AI use in affidavits and witness statements. ([lawyerly.com.au](https://www.lawyerly.com.au/nsw-lawyers-told-ai-cannot-be-used-to-generate-affidavits-witness-statements/?utm_source=openai))

Overall assessment

Verdict (FAIL, OPEN, PASS): PASS

Confidence (LOW, MEDIUM, HIGH): MEDIUM

Summary:
The article is recent and presents plausible claims about the Civil Justice Council’s proposal regarding AI use in drafting witness statements. However, the reliance on a press release from the Civil Justice Council raises concerns about the independence of the information. The inability to independently verify the earliest usage of direct quotes further diminishes confidence in the article’s originality. Given these factors, the overall assessment is a PASS with MEDIUM confidence.
