Technological Tangle in Cohen’s Legal Proceedings: AI’s Role in a Legal Blunder

January 4, 2024

In an unexpected twist in the legal saga involving Michael Cohen, Donald Trump’s former lawyer, the legal world has been shaken by an incident involving artificial intelligence and its use in legal work. The episode, which played out in a federal court in Manhattan, highlights the increasingly significant role of AI across industries, including the legal sector.

Cohen, who had been convicted of multiple charges, including tax evasion and lying to Congress, was under court-ordered supervision following his incarceration. Seeking an early end to that supervision, his lawyer, David M. Schwartz, filed a motion with the court. The motion, however, contained fabricated legal case citations, a fact Cohen was unaware of at the time of filing.

The source of the fictitious citations was Google Bard, an AI chatbot similar to ChatGPT. Cohen, mistaking it for a sophisticated search engine, did not realize that Google Bard, like the AI built into Microsoft’s Bing, can generate non-factual content, a tendency known as “hallucination” in AI terminology. The misunderstanding led to the inclusion of non-existent legal cases in his motion.

The issue surfaced when Judge Jesse Furman, overseeing the case, questioned the origin of these dubious citations. Cohen disclosed that he had resorted to online tools for legal research due to his disbarment and subsequent loss of access to standard legal resources. His inadvertent use of Google Bard, a recent entrant in the AI market, resulted in this peculiar legal faux pas.

The blame initially fell on Schwartz, Cohen’s attorney and friend, for not verifying the citations. Cohen, however, has sought leniency for Schwartz, arguing that the mistake was unintentional.

Adding to the drama, Schwartz believed that another of Cohen’s attorneys, E. Danya Perry, had reviewed the document drafts, a claim Perry vehemently denies. Upon discovering the false citations, Perry immediately informed the judge and the prosecutors.

This is not an isolated incident. Earlier in the year, lawyers in the same Manhattan federal court were penalized for citing fictitious cases produced by ChatGPT.

This development intersects with Trump’s own legal troubles. In a separate case in New York, Trump faces 34 felony charges of falsifying his private company’s business records in connection with hush-money payments. He has pleaded not guilty in this and three other criminal cases, dismissing them as politically motivated efforts to thwart his potential 2024 presidential run.

Cohen’s mishap with AI-generated legal citations underscores the challenges and hazards of new technology in professional fields. It also adds another dimension to the ongoing legal narratives involving Trump and his circle. Above all, the incident serves as a warning about the importance of careful scrutiny and verification in the era of digital technology.
