Behrens and Bennett: Proposed Federal Rule Could Help Curb AI Missteps

As lawyers increasingly turn to generative AI to aid in their work, courts are beginning to respond to the issue of hallucinations in court filings—which can include both fictitious cases and real cases with fictitious language attributed to them. To combat the issue, Shook Public Policy Practice Group Co-Chair Mark Behrens and Associate Jacob Bennett propose an amendment to Rule 11(a) of the Federal Rules of Civil Procedure that would require attorneys filing a document containing citations to certify that any content created with the assistance of AI has been accurately cited.

Behrens and Bennett discuss the proposed rule change in an Expert Analysis article for Law360 titled “A Uniform Federal Rule Would Curb Gen AI Missteps in Court.” In the article, they propose the following certification: 

I certify that I or an individual under my supervision has reviewed all content in this filing created with the assistance of generative artificial intelligence to ensure that the authorities cited therein exist, have been accurately cited, and support the propositions for which they have been cited. I further acknowledge that the court may take corrective action, including imposition of sanctions, if hallucinated content is submitted to the court.

“This language would put attorneys on notice as to their obligations and deter blind reliance on generative AI, without deterring the use of potentially beneficial tools,” they say. “The proposed rule would also remind judges of the pitfalls of AI and encourage best practices, potentially allowing members of the judiciary to avoid embarrassing missteps too.”

Read the article at Law360 >>