How would you react if you ordered a delightful gift from a “discount” website to be delivered to a good friend for their birthday…but they instead received a gift box filled with dirty, smelly rags (and your name on the gift note)?
When a self-represented litigant or attorney asks AI (artificial intelligence) tools like ChatGPT, Claude, Gemini, Perplexity, or others for help with legal research and drafting for something to be filed with a court—and fails to verify the citations, summaries, and arguments for completeness and accuracy—there’s a high risk they’ll send the court fake case names, citations, legal summaries, or outdated information. In other words, the court receives dirty, smelly rags.

Like your friend opening a birthday gift of dirty rags, the judge will not have positive thoughts upon receiving an unreliable court filing drafted by AI.
What can happen? Judges take different approaches to the consequences.
A warning or caution about future filings is the mildest consequence, as federal district court judge Michael H. Simon decided in Schoene v Oregon Dept of Human Services, opinion and order of the United States District Court for the District of Oregon, issued June 25, 2025 (Case No. 23-cv-00742), pp 13-14:
Before addressing the merits of Schoene’s motion, the Court notes that Schoene cited several cases in her reply brief to support her motion to amend, including Butler v. Oregon, 218 Or. App. 114 (2008), Curry v. Actavis, Inc., 2017 LEXIS 139126 (D. Or. Aug. 30, 2017), Estate of Riddell v. City of Portland, 194 Or. App. 227 (2004), Hampton v. City of Oregon City, 251 Or. App. 206 (2012), and State v. Burris, 107 Or. App. 542 (1991). These cases, however, do not exist. Schoene’s false citations appear to be hallmarks of an artificial intelligence (“AI”) tool, such as ChatGPT. It is now well known that AI tools “hallucinate” fake cases. See Kruse v. Karlen, 692 S.W.3d 43, 52 (Mo. Ct. App. 2024) (noting, in February 2024, that the issue of fictitious cases being submitted to courts had gained “national attention”). In addition, the Court notes that a basic internet search seeking guidance on whether it is advisable to use AI tools to conduct legal research or draft legal briefs will explain that any legal authorities or legal analysis generated by AI needs to be verified. The Court cautions Schoene that she must verify the accuracy of any future citations she may include in briefing before this Court and other courts.
Federal district court judge Terrence G. Berg issued a similar caution in Hunt v Alanis Nadine Morissette and Epiphany Productions, Inc, order denying motion for reconsideration of the United States District Court for the Eastern District of Michigan, issued June 11, 2025 (Case No. 24-cv-12947), pp 1-2:
As a preliminary matter, the Court notes that Plaintiff appears to base this argument on questions that he posed to an artificial intelligence (“AI”) program, and the answers that program provided. ECF No. 24-1, PageID.280-82. AI programs “are based on complex mathematical systems that learn their skills by analyzing enormous amounts of digital data. They do not — and cannot — decide what is true and what is false. Indeed, at times, they just make stuff up—a phenomenon some A.I. researchers call hallucinations.” Cade Metz & Karen Weise, A.I. Is Getting More Powerful, but Its Hallucinations Are Getting Worse, N.Y. Times (May 6, 2025), https://www.nytimes.com/2025/05/05/technology/ai-hallucinations-chatgpt-google.html. As such, some courts have rejected the use of such AI programs in legal pleadings “out of hand”. J.G. v. New York City Dep’t of Educ., 719 F. Supp. 3d 293, 308 (S.D.N.Y. 2024). Because AI-generated material is unlikely to be helpful to Plaintiff, the Court directs that in the event Plaintiff should file any additional pleadings, he should not include or rely upon AI-generated material.
An attorney was similarly warned by federal district court judge William C. Griesbach in Araujo v Wedelstadt et al, decision and order denying motion for summary judgment of the United States District Court for the Eastern District of Wisconsin, issued January 22, 2025 (Case No. 23-cv-1190):
To the extent counsel used an artificial intelligence tool (e.g., ChatGPT) that generated fake case citations, this is unacceptable. Counsel is warned that any future filings with citations to nonexistent cases may result in sanctions. The court now turns to the merits of Defendants’ motion.
The court strikes the filing(s) from the record. Federal district court judge Robert E. Payne struck six filings submitted by a self-represented litigant because they were filled with fake cases. The litigant was allowed to submit replacement filings that complied with the court rules. Powhatan County School Bd v Skinger and Lucas, memorandum opinion of the United States District Court for the Eastern District of Virginia, issued June 2, 2025 (Case No. 24-cv-00874).
The party loses the case. The Illinois Appellate Court found that the only two cases a self-represented litigant cited were incorrect, possibly the result of AI, and that the litigant failed to preserve his argument for appellate review. In re Marriage of Isom, 2025 IL App (3d) 240491-U, pp 4-5:
In addition, Gbolahan’s appellate brief mentions only two case decisions, both of which are incorrectly cited. The first, In re Marriage of Zells, cited at 138 Ill. 2d 437 (1990), is cited for the simple proposition that the trial court’s decision must be fair and reasonable. We note that the correct citation for that case is 143 Ill. 2d 251 (1991) and that the decision does not assert the proposition for which it is cited. We were unable to locate the second case cited by Gbolahan, In re Marriage of Amland, 2015 IL App (1st) 142683. The brief reads as though Gbolahan conducted his legal research through ChatGPT rather than through reliable resources like Westlaw or Lexis.
The party lost its case, but the Illinois Appellate Court did not impose any other sanctions.
The party loses and must pay damages. The Missouri Court of Appeals, Eastern District, found that the self-represented party’s filings included “numerous fatal briefing deficiencies” and fake AI-generated cases. The court dismissed the appeal, and the self-represented party was ordered to pay the other side $10,000 for filing a frivolous appeal. Kruse v Karlen, opinion of the Missouri Court of Appeals, Eastern District, issued February 13, 2024 (Case No. ED111172).
The party can be ordered to show cause. The Vermont Superior Court noted that a self-represented party had previously been warned against using fake case citations and quotations in court filings. When the party continued to file pleadings containing fake citations and quotations, the court scheduled a hearing for the party to show cause why sanctions, including attorney’s fees, should not be imposed. Lafayette v Abrami, order on pending motions, Vermont Superior Court, issued May 20, 2025 (Case No. 25-CV-00624).
The attorneys and party can be ordered to show cause. Federal district court judge Nina Y. Wang ordered several attorneys and their clients to show cause why (1) the clients should not be sanctioned, and (2) the attorneys should not be referred for disciplinary proceedings, because a motion in limine prepared with AI included fake citations and was replete with fundamental errors. Coomer v Lindell, et al, order to show cause of the United States District Court for the District of Colorado, issued May 23, 2025 (Case No. 22-cv-01129). [Update: Judge Wang imposed two separate $3,000 sanctions against the attorneys in a strongly worded 20-page order on July 7, 2025.]
The filing attorney can be fined (contempt of court), ordered to complete continuing legal education, and ordered to self-report to the disciplinary boards of every bar of which the attorney is a member. These were among the sanctions federal magistrate judge Damian L. Martinez ordered in Dehghani v Castro, memorandum opinion and order on sanctions and other disciplinary action of the United States District Court for the District of New Mexico, issued April 2, 2025 (Case No. 25-cv-00052).
Federal district court judge Kai N. Scott only fined the attorney and ordered continuing legal education, but the public memorandum is blistering in its findings that the attorney submitted briefs that (1) cited nonexistent cases, (2) cited case law that did not support the stated propositions, and (3) cited cases that had been vacated or overruled. Bunce v Visual Technology Innovations, Inc, memorandum of the United States District Court for the Eastern District of Pennsylvania, issued February 27, 2025 (Case No. 23-cv-01740).
AI hallucination cases are the dirty, smelly rags of the courts’ work. Because judges take different approaches to addressing them, it’s good practice to bookmark Damien Charlotin’s tracking database at https://www.damiencharlotin.com/hallucinations/.