Published March 24, 2026 | Updated March 2026

The $86,000 AI Mistake: What Every Lawyer Needs to Know About ByoPlanet v. Johansson


What happened in ByoPlanet v. Johansson?

Attorney James Martin Paul used ChatGPT to draft legal filings across eight related cases in the Southern District of Florida. The filings contained hallucinated case citations and fabricated quotations. He was put on notice on April 25, 2025 that his citations were false. He then filed seven more submissions containing hallucinated content, including in his response to the court's own order to show cause. Judge David Leibowitz sanctioned him nearly $86,000, dismissed four federal cases without leave to amend, required him to attach the sanctions order to every filing in the district for two years, and referred him to the Florida Bar. It is the largest AI citation sanction to date.

The judge who handed down the largest AI sanctions order in US legal history opened it with a quote from the late Justice Antonin Scalia on the importance of candor in judicial proceedings.

That quote was itself generated by ChatGPT.

Judge David Leibowitz of the Southern District of Florida knew exactly what he was doing. The irony was the point. A sanctions order about fake AI-generated legal authority, opened with fake AI-generated authority. The message was impossible to miss.

On July 17, 2025, Judge Leibowitz ordered attorney James Martin Paul to pay nearly $86,000 in opposing counsel fees, dismissed four federal cases without leave to amend, required him to attach a copy of the sanctions order to every filing he makes in the Southern District for the next two years, and referred him to the Florida Bar for discipline.

It is the largest AI hallucination sanction in US legal history. And it started the same way most AI citation problems start: a busy lawyer on a deadline, relying on a tool he did not fully understand and on a paralegal to draft.

What Actually Happened: A Verified Chronology

Factual and chronological. No editorialising. Every detail verified from the primary sources and the FindLaw sanctions order.

April 1, 2025
Paul filed complaints against Peter Johansson and Charles Gilstrap in the Southern District of Florida, Case No. 0:25-cv-60630. He used ChatGPT to draft the filings. The complaints contained hallucinated case citations and fabricated quotations.
April 25, 2025
Opposing counsel notified Paul that his citations were false. From this date, Paul was aware that his AI tool was producing hallucinated authority.
After April 25
Paul filed seven more submissions to courts containing hallucinated citations and fabricated quotations. This included his response to the court's order to show cause about the hallucinations.
May 5, 2025
Paul cited Smith v. JPMorgan Chase Bank, N.A. for the proposition that courts do not dismiss claims over citation errors. That case does not exist. The citation leads to a different, unrelated Louisiana case. He also cited Hood v. Tompkins for a quotation about Rule 11 sanctions. The quotation does not appear anywhere in Hood.
At the hearing
The court asked Paul directly whether he denied being on notice as of April 25 that his case law contained misrepresentations. He confirmed he was aware.
July 15 to 17, 2025
Judge Leibowitz issued the full sanctions order. The epigraph was a Scalia quote on candor, generated by a ChatGPT prompt dated July 7, 2025. The judge confirmed this in the order.

Consequences

  • Four federal cases dismissed without prejudice and without leave to amend
  • Paul ordered to pay full opposing counsel fees and costs across all cases, nearly $86,000
  • Required to attach the sanctions order to every filing in the Southern District for the next two years
  • Referred to the Florida Bar for discipline

“A reasonable attorney does not blindly rely on AI to generate filings. A reasonable attorney, when made aware that his practices were leading to hallucinated cases and quotations, immediately changes course.” — Judge David Leibowitz, Southern District of Florida

Note:

Judge Leibowitz also cited Versant Funding LLC v. Teras Breakbulk Ocean Navigation Enters., where two attorneys who discovered a hallucinated citation, acknowledged it immediately, and apologised to the court received significantly lighter sanctions.

How you respond after discovery still matters.

Why This Keeps Happening

Not because lawyers are careless. Because lawyers are busy.

  • 700+ court decisions worldwide addressing AI hallucinations by mid-2025
  • 2–3 new cases per day by mid-2025, up from roughly two per week earlier that year
  • 10x: the ByoPlanet figure versus the previous AI hallucination sanction record

James Martin Paul is not unusual in his use of AI. He is unusual in the scale and persistence of his misuse after notice. But the underlying behaviour — relying on AI-generated text without independently verifying every citation — is extremely common.

The pattern in these cases is not stupidity. It is capacity. Paul was managing eight related cases simultaneously across state and federal courts. He was relying on a paralegal. He was under deadline pressure. These are the conditions under which every solo and small firm attorney works.

The enforcement mechanism that produced the $86,000 figure is important to understand. This was not a fixed fine. It was fee-shifting. Opposing counsel asked the court to award them the fees they spent dealing with Paul’s hallucinated filings. Once courts accept that time spent responding to AI-fabricated content is compensable, sanctions can scale with the volume and complexity of the litigation. That is why the ByoPlanet number is nearly ten times the previous record — and it is why even higher numbers are likely to follow.
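The scaling logic of fee-shifting can be made concrete with a short calculation. Every figure below is hypothetical and invented purely for illustration; none of these hours, rates, or filing counts come from the actual ByoPlanet record. The point is the shape of the math: the award is the sum of opposing counsel's time responding to each tainted filing, multiplied by their rates, so it grows with the volume of litigation.

```python
# Illustration of how a fee-shifting award scales with litigation volume.
# All numbers are hypothetical; they are NOT taken from the ByoPlanet order.

# (number of filings responded to, hours spent per filing, billing rate per hour)
responses = [
    (3, 6.0, 450.0),   # motions addressing hallucinated citations
    (4, 4.5, 450.0),   # replies addressing fabricated quotations
    (2, 8.0, 525.0),   # senior-attorney time on the sanctions briefing
]

award = sum(count * hours * rate for count, hours, rate in responses)
print(f"Hypothetical fee-shifting award: ${award:,.2f}")
# Hypothetical fee-shifting award: $24,600.00
```

Double the number of tainted filings, or move the dispute into more complex litigation with higher rates, and the award doubles with it. That is the mechanism behind the record-setting figure.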

That exposure includes work you have already filed.

Not because you would do what Paul did after notice. But because the starting condition (using AI to assist with citation research without a systematic verification workflow) is shared by a significant portion of the US bar.

The ABA’s 2025 TechReport found that 79% of lawyers report using AI tools in some capacity. The Charlotin database documents more than 700 court decisions involving AI hallucinations. The gap between those two numbers is not filled by bad lawyers. It is filled by busy lawyers, working under deadline pressure, who skipped a step they would not have skipped if they had more time.

Ask yourself these questions:

  1. On your last filing, did you pull every cited case and read the relevant section yourself?
  2. If a paralegal or associate drafted a brief using AI, did you verify each citation before you signed?
  3. If opposing counsel asked you tomorrow to produce documentation of your verification process for a recent filing, could you?

If the answer to any of those is uncertain, the ByoPlanet case is not someone else's problem.

The court specifically noted that Paul could not delegate his verification obligation to a paralegal. The duty of competence under ABA Model Rule 1.1 runs to the signing attorney. ABA Formal Opinion 512 makes this explicit: using AI does not lower the standard of care. Not knowing your tool hallucinates is not a defense.

What to Do Before Your Next Filing

1. Never use a general-purpose AI tool as your primary citation source

ChatGPT, Claude, Gemini, and Copilot generate text based on training data patterns. They do not retrieve from legal databases. They produce plausible-sounding citations that may not exist. Even when the case name is real, the quotation, the holding, or the page citation may be fabricated. The ByoPlanet record contains examples of all three. Use these tools for structuring arguments, drafting email language, summarising documents you have already read. Do not use them to find cases.

2. Verify every citation against the primary source before the brief leaves your desk

Pull the case. Read the relevant section. Confirm the quotation exists in the text and says what you claim it says. Confirm the holding supports the proposition you are citing it for. Confirm it is still good law. This step cannot be delegated without supervision. The court in ByoPlanet was explicit: the duty runs to the signing attorney.

3. Use a legal-specific AI tool that retrieves from primary sources

Tools built on retrieval-augmented generation query actual legal databases before returning results. Every citation links to the source document. You can verify in one click. This does not eliminate your verification obligation, but it eliminates the hallucination risk at the point of research. NeXa queries primary US legal databases across all 50 states and federal circuits and returns source-linked results. Every answer links to the original case, statute, or regulation. No setup required. Verification becomes a check of a link, not a search from scratch.
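The retrieval-first principle can be sketched in a few lines of code. Everything below is invented for illustration: the "database" is a stand-in dictionary, and both citations are made up. Real tools query actual legal data sources, but the logic is the same: a citation is only usable if it resolves to a real source document, and anything that resolves to nothing is rejected before it can reach a filing.

```python
# Minimal sketch of retrieval-backed citation checking.
# The "database" and every citation in it are invented for illustration only.

KNOWN_CASES = {
    "575 U.S. 100": "Example v. Sample (entry that resolves in the database)",
}

def verify_citation(reporter_cite: str) -> tuple[bool, str]:
    """Accept a citation only if it resolves to a source document."""
    source = KNOWN_CASES.get(reporter_cite)
    if source is None:
        return False, f"REJECT: {reporter_cite} does not resolve to any source"
    return True, f"OK: {reporter_cite} -> {source}"

# A generative model can emit a plausible-looking cite that resolves to nothing:
ok, msg = verify_citation("123 F.4th 456")   # hallucinated; not in the database
print(msg)

ok, msg = verify_citation("575 U.S. 100")    # resolves to a source document
print(msg)
```

A generative model, by contrast, has no such gate: it produces the citation string directly, with nothing forcing it to resolve against a source before it appears in your draft.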

If you have a filing going out this week, this is the step that determines whether it holds up or gets challenged.

The $86,000 in ByoPlanet came from fee-shifting, not a fixed fine. Every minute opposing counsel spent responding to hallucinated citations was compensable.

NeXa makes citation verification part of the research workflow, not an extra step after it.

The ByoPlanet sanctions did not require bad intent. They required only that verification did not happen.

NeXa makes verification automatic, not optional. Every citation links to the primary source before it leaves your research. Try it on your next filing.


Frequently Asked Questions


What happened in ByoPlanet v. Johansson?

Attorney James Martin Paul used ChatGPT to draft legal filings across eight related cases in the Southern District of Florida. The filings contained hallucinated case citations and fabricated quotations. He was notified on April 25, 2025 that his citations were false, then filed seven more submissions containing hallucinated content. Judge David Leibowitz sanctioned him nearly $86,000 in opposing counsel fees, dismissed four federal cases without leave to amend, required him to attach the sanctions order to every filing in the district for two years, and referred him to the Florida Bar. It is the largest AI citation sanction to date. The full sanctions order is publicly available.

What was the $86,000 sanction in ByoPlanet based on?

The $86,000 figure was not a fixed fine. It was a fee-shifting award, representing full reimbursement of opposing counsel's fees for time spent responding to AI-generated hallucinations across the related cases. Courts increasingly use fee-shifting as the enforcement mechanism because it scales with the harm caused. This is why the ByoPlanet figure is nearly ten times the previous AI hallucination sanction record, and why even higher numbers are expected to follow.

Can I be sanctioned for using ChatGPT for legal research?

You can be sanctioned for filing unverified AI-generated citations, regardless of which tool produced them. ABA Formal Opinion 512 (July 2024) makes clear that using AI does not lower the standard of care. The duty to verify every citation you file runs to the signing attorney. If you use ChatGPT or another general-purpose AI tool and cite cases without independently verifying each one against the primary source, you carry the same risk as Attorney Paul in ByoPlanet.

What does ABA Formal Opinion 512 say about AI citation verification?

ABA Formal Opinion 512 (July 2024) establishes that existing professional duties apply to AI-generated work. Model Rule 1.1 (competence) requires attorneys to understand the limitations of AI tools. Model Rule 3.3 (candor toward the tribunal) requires that citations submitted to court be accurate. The opinion does not prohibit AI use. It requires attorneys to verify all AI-generated output before relying on it professionally, and establishes that ignorance of AI's limitations is not a defense.

What is the difference between ChatGPT and a legal AI tool for citation research?

ChatGPT generates text based on training data patterns. It does not retrieve from legal databases. It produces citations that look plausible but may not exist. Legal AI tools built on retrieval-augmented generation query primary legal databases before generating a response. Every citation links to the source document. NeXa is built on this architecture. In the ByoPlanet case, every hallucinated citation could have been caught before filing if it had been generated by a tool that retrieves from primary sources rather than generating synthetic text.

What happens if a paralegal uses AI and the partner signs the filing?

The signing attorney bears full responsibility. Judge Leibowitz addressed this directly in ByoPlanet: the court stated that a reasonable attorney does not rely on a paralegal to draft a filing without supervising and verifying the output. The duty of competence and candor under the Model Rules runs to the attorney whose signature appears on the filing. Delegation of the drafting task does not delegate the verification obligation.


© 2026 NEXLAW INC.

AI Legal Assistant | All Rights Reserved.

ISO 27001 Certified | GDPR Compliant | HIPAA Compliant | SOC 2 Type II Certified

NexLaw is a SOC 2 Type II compliant platform utilizing AES-256 encryption. Our zero-data retention policy for enterprise users ensures that your work product remains privileged and is never used to train our models.
