Legal AI Hallucinations and Your Attorney Malpractice Insurance Coverage
What are AI hallucinations?
- According to Google: “AI hallucinations are incorrect or misleading results that AI models generate. These errors are caused by a variety of factors, including insufficient training data, incorrect assumptions made by the model, or biases in the data used to train the model.”
Examples of AI hallucinations:
- An AI-generated image of an explosion near the Pentagon that caused a stock market decline
- Google’s Bard chatbot claiming the James Webb Space Telescope captured the first image of an exoplanet
- Microsoft’s Sydney chatbot claiming to fall in love with users
- A widely publicized 2023 New York case in which the court sanctioned lawyers for submitting a ChatGPT-generated brief without checking the cases it cited; the cited cases did not exist.
- Lawyers for Walmart and co-defendant Jetson Electric Bikes told the court that they could not find nine cases the plaintiffs had cited in a court filing and suggested AI played a role. The Morgan & Morgan attorneys representing the plaintiffs, who alleged injuries from a defective hoverboard toy, admitted they had inadvertently included made-up cases generated by artificial intelligence in the filing. The Wyoming federal judge ordered the plaintiffs’ attorneys to explain why they should not be sanctioned, and Morgan & Morgan sent an email to its 1,000-plus attorneys warning them about AI use and inaccurate results.
You would think attorneys would have learned their lesson about using AI to generate a brief and then submitting it to the court without fact-checking. Imagine your client’s reaction on realizing they paid for an AI-generated, fictitious brief.
Known AI Hallucinations in Law Practice:
- Fabricated Case Law & Citations – AI is known to generate fake legal precedents and citations; submitting them to the court can lead to fines and sanctions.
- Misinterpretation of Legal Principles – AI may oversimplify complex legal doctrines or misapply case law, leading to flawed legal arguments.
- Inaccurate Contract Analysis – Using AI tools for contract review may result in missed clauses or misinterpretations.
How will your attorney malpractice policy respond to claims arising from using AI without fact-checking?
- It depends on your policy. Insurers are starting to attach endorsements that limit coverage for using AI without fact-checking. Count on your premiums increasing.
- As for fines, sanctions, and attorney’s fees imposed by the courts over AI misuse, the attorney will likely be paying out of their own pocket; almost all attorney malpractice insurers have policy language that excludes such coverage.
How to Prevent AI Hallucinations in Law Practice
- Use AI as a Supplement, Not a Sole Authority – Always verify AI-generated legal information against primary sources.
- Cross-Check Citations – If AI provides legal precedents, double-check their existence and relevance (see the sketch after this list).
- Use Legal-Specific AI Tools – For example, use AI models designed specifically for legal research, which carry a lower hallucination risk.
- Consult Legal Experts – AI should not replace human judgment, especially in high-stakes legal matters.
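For firms that want a lightweight technical backstop on top of these practices, part of the citation cross-check can be automated. The following is only a minimal sketch, not legal-research software: it assumes a hypothetical case-law search endpoint (the SEARCH_URL and its "citation" parameter and "count" field are illustrative stand-ins, not a documented API), and anything it flags or passes still requires attorney review against primary sources.

```python
# Minimal sketch: flag citations that a case-law database cannot match.
# Assumptions (hypothetical, not a documented API): SEARCH_URL accepts a
# "citation" query parameter and returns JSON containing a "count" of matches.
import requests

SEARCH_URL = "https://example-caselaw-database.org/api/search/"  # hypothetical endpoint


def find_unverified_citations(citations: list[str]) -> list[str]:
    """Return citations the database could not match (candidates for manual review)."""
    unverified = []
    for cite in citations:
        resp = requests.get(SEARCH_URL, params={"citation": cite}, timeout=10)
        resp.raise_for_status()
        if resp.json().get("count", 0) == 0:  # no matching opinion found
            unverified.append(cite)
    return unverified


if __name__ == "__main__":
    draft_citations = ["575 U.S. 320", "123 F.4th 456"]  # example citations from a draft brief
    for cite in find_unverified_citations(draft_citations):
        print(f"Could not verify: {cite} -- confirm in the reporter or official docket before filing.")
```

Even when a lookup succeeds, a match only shows that a case with that citation exists; its relevance and the accuracy of any quoted holding still require human verification.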
Get a Lawyers Liability / Malpractice Insurance Quote

Contact Me Today
Lee Norcross, MBA, CPCU
California License # 0D87292
L Squared Insurance Agency, LLC ® DBA in California as L2 L Squared Insurance Agency, License # 0L93416
Managing Director, CEO
Lee@L2Ins.com
616-726-7080