What happened in ByoPlanet v. Johansson?
Attorney James Martin Paul used ChatGPT to draft legal filings across eight related cases in the Southern District of Florida. The filings contained hallucinated case citations and fabricated quotations. He was put on notice on April 25, 2025 that his citations were false. He then filed seven more submissions containing hallucinated content, including in his response to the court's own order to show cause. Judge David Leibowitz sanctioned him nearly $86,000, dismissed four federal cases without leave to amend, required him to attach the sanctions order to every filing in the district for two years, and referred him to the Florida Bar. It is the largest AI citation sanction to date.
The judge who handed down the largest AI sanctions order in US legal history opened it with a quote from the late Justice Antonin Scalia on the importance of candor in judicial proceedings.
That quote was itself generated by ChatGPT.
Judge David Leibowitz of the Southern District of Florida knew exactly what he was doing. The irony was the point. A sanctions order about fake AI-generated legal authority, opened with fake AI-generated authority. The message was impossible to miss.
On July 17, 2025, Judge Leibowitz ordered attorney James Martin Paul to pay nearly $86,000 in opposing counsel fees, dismissed four federal cases without leave to amend, required him to attach a copy of the sanctions order to every filing he makes in the Southern District for the next two years, and referred him to the Florida Bar for discipline.
It is the largest AI hallucination sanction in US legal history. And it started the same way most AI citation problems start, with a busy lawyer relying on a tool he did not fully understand, on a deadline, with a paralegal drafting.
What Actually Happened: A Verified Chronology
Every detail below is drawn from the primary court filings and the published sanctions order.

- Attorney James Martin Paul filed ChatGPT-drafted submissions containing hallucinated case citations and fabricated quotations across eight related cases in the Southern District of Florida.
- April 25, 2025: Paul was put on notice that his citations were false.
- After notice, he filed seven more submissions containing hallucinated content, including his response to the court's own order to show cause.
- July 17, 2025: Judge David Leibowitz issued the sanctions order.
Consequences
- Four federal cases dismissed without prejudice and without leave to amend
- Paul ordered to pay full opposing counsel fees and costs across all cases, nearly $86,000
- Required to attach the sanctions order to every filing in the Southern District for the next two years
- Referred to the Florida Bar for discipline
“A reasonable attorney does not blindly rely on AI to generate filings. A reasonable attorney, when made aware that his practices were leading to hallucinated cases and quotations, immediately changes course.” — Judge David Leibowitz, Southern District of Florida
Note:
Judge Leibowitz also cited Versant Funding LLC v. Teras Breakbulk Ocean Navigation Enters., where two attorneys who discovered a hallucinated citation, acknowledged it immediately, and apologised to the court received significantly lighter sanctions.
How you respond after discovery still matters.
Why This Keeps Happening
Not because lawyers are careless. Because lawyers are busy.
- More than 700 court decisions worldwide addressing AI hallucinations by mid-2025
- New cases surfacing daily by mid-2025, up from roughly two per week earlier that year
- The ByoPlanet figure: nearly ten times the previous AI hallucination sanction record
James Martin Paul is not unusual in his use of AI. He is unusual in the scale and persistence of his misuse after notice. But the underlying behaviour — relying on AI-generated text without independently verifying every citation — is extremely common.
The pattern in these cases is not stupidity. It is capacity. Paul was managing eight related cases simultaneously across state and federal courts. He was relying on a paralegal. He was under deadline pressure. These are the conditions under which every solo and small firm attorney works.
The enforcement mechanism that produced the $86,000 figure is important to understand. This was not a fixed fine. It was fee-shifting. Opposing counsel asked the court to award them the fees they spent dealing with Paul’s hallucinated filings. Once courts accept that time spent responding to AI-fabricated content is compensable, sanctions can scale with the volume and complexity of the litigation. That is why the ByoPlanet number is nearly ten times the previous record — and it is why even higher numbers are likely to follow.
If you have used ChatGPT, Claude, Gemini, or any general-purpose AI tool for legal research, this case applies to your practice today.
This includes work you have already filed.
Not because you would do what Paul did after notice. But because the starting condition is shared by a significant portion of the US bar: using AI to assist with citation research without a systematic verification workflow.
The ABA’s 2025 TechReport found that 79% of lawyers report using AI tools in some capacity. The Charlotin database documents more than 700 court decisions involving AI hallucinations. The gap between those two numbers is not filled by bad lawyers. It is filled by busy lawyers, working under deadline pressure, who skipped a step they would not have skipped if they had more time.
Ask yourself these questions:
- On your last filing, did you pull every cited case and read the relevant section yourself?
- If a paralegal or associate drafted a brief using AI, did you verify each citation before you signed?
- If opposing counsel asked you tomorrow to produce documentation of your verification process for a recent filing, could you?
If the answer to any of those is uncertain, the ByoPlanet case is not someone else's problem.
The court specifically noted that Paul could not delegate his verification obligation to a paralegal. The duty of competence under ABA Model Rule 1.1 runs to the signing attorney. ABA Formal Opinion 512 makes this explicit: using AI does not lower the standard of care. Not knowing your tool hallucinates is not a defense.
What to Do Before Your Next Filing
Never use a general-purpose AI tool as your primary citation source
ChatGPT, Claude, Gemini, and Copilot generate text based on training data patterns. They do not retrieve from legal databases. They produce plausible-sounding citations that may not exist. Even when the case name is real, the quotation, the holding, or the page citation may be fabricated. The ByoPlanet record contains examples of all three. Use these tools for structuring arguments, drafting email language, summarising documents you have already read. Do not use them to find cases.
Verify every citation against the primary source before the brief leaves your desk
Pull the case. Read the relevant section. Confirm the quotation exists in the text and says what you claim it says. Confirm the holding supports the proposition you are citing it for. Confirm it is still good law. This step cannot be delegated without supervision. The court in ByoPlanet was explicit: the duty runs to the signing attorney.
Use a legal-specific AI tool that retrieves from primary sources
Tools built on retrieval-augmented generation query actual legal databases before returning results. Every citation links to the source document. You can verify in one click. This does not eliminate your verification obligation, but it eliminates the hallucination risk at the point of research. NeXa queries primary US legal databases across all 50 states and federal circuits and returns source-linked results. Every answer links to the original case, statute, or regulation. No setup required. Verification becomes a check of a link, not a search from scratch.
If you have a filing going out this week, this is the step that determines whether it holds up or gets challenged.
The $86,000 in ByoPlanet came from fee-shifting, not a fixed fine. Every minute opposing counsel spent responding to hallucinated citations was compensable.
The ByoPlanet sanctions did not require bad intent. They required only that verification did not happen.
NeXa makes verification automatic, not optional. Every citation links to the primary source before it leaves your research. No setup. Upload and see results immediately. Try it on your next filing.


