BBC: A New York lawyer is facing a court hearing of his own after his firm used AI tool ChatGPT for legal research.
A judge said the court was faced with an “unprecedented circumstance” after a filing was found to reference example legal cases that did not exist.
The lawyer who used the tool told the court he was “unaware that its content could be false”.
ChatGPT creates original text on request, but comes with warnings it can “produce inaccurate information”.
The original case involved a man suing an airline over an alleged personal injury. His legal team submitted a brief that cited several previous court cases in an attempt to prove, using precedent, why the case should move forward.
But the airline’s lawyers later wrote to the judge to say they could not find several of the cases that were referenced in the brief.
“Six of the submitted cases appear to be bogus judicial decisions with bogus quotes and bogus internal citations,” Judge Castel wrote in an order demanding the man’s legal team explain itself. MORE
I’m surprised NY courts would complain; the cases must have involved non-Democrats.
Once, in grade school, I had a book report due the next day, only I hadn’t read a book. So I made up a book: title, characters, and plot, the whole thing out of whole cloth, and then submitted a report. I actually got away with it, but this was long before computers and Google. It would have been easier to read a book. I grew out of it and never did it again.
Apparently, ChatGPT has more intelligence than the lawyer.
Since the last two elections were entirely made up, what’s a couple more lawyer lies one way or another…
I’ve been on jury duty.
Facts don’t matter to most people on a jury. It’s all about teh feelz.