AI (Artificial Intelligence) and ChatGPT are buzzwords that barely existed in most conversations just a couple of years ago. We're ignoring the lessons of the "Terminator" movies, and AI is here to stay.

I get hit up by so many companies trying to get me to incorporate AI into my practice. In many ways it feels like the early 2000s, when people were trying to sell websites or online businesses with no real evaluation of whether they were any good at their job. People love the shiny new thing and don't want to miss out.

That's not to say that AI couldn't have its uses, but I'd venture to guess that time is down the road. Right now, in my opinion, it's like taking legal advice from a first-year law student.

I recently got my first phone call from someone looking for legal help who had formed an opinion via an AI app and wanted to confirm whether it was correct. Unfortunately, AI got almost everything wrong. The danger, of course, is that people are going to rely on these apps and assume that they are getting correct advice.

The law is different in every state and constantly changing. But the bigger issue, as far as I've seen, is that AI relies on you to present all of the relevant facts and can't be counted on to ask the things an experienced attorney would ask. AI also isn't likely to recognize other legal issues you aren't thinking of, issues that would only be discovered by asking probing questions.

At least once a week, for example, I talk to someone about wrongful termination or medical malpractice and discover that their best case is actually for workers' compensation due to a work-related injury. Many of those callers will tell me they don't have a work comp claim, but it's only when I explain how work comp law actually works in Illinois that they realize they do have that option.

One company has an AI feature that it claims will predict the likelihood of success in your case. That seems like the biggest bit of horse crap I've ever heard. There are so many factors in every case that a computer can't know, namely what the other side is going to say. I can tell you why I think I should have custody of my child, but I can't tell you what my ex is going to say because I can't speak for them. Or I could tell you what happened to me at a hospital, but without access to my medical records, there's no reasonable way to predict whether we'll win the case.

And none of this even considers who the Judge is on a case and how they tend to think. If you believe a computer can indicate what a Judge is going to do, I've got stories of Judges falling asleep during trial, showing up inebriated, or being in a terrible mood for personal reasons that would indicate otherwise.

I think AI at its best right now is a better search engine than existed before, but used as anything other than a supplemental tool for an attorney, it's dangerous. Oh, and don't get me started on law firms that use an AI tool for their live chats to help generate leads. That just makes you less customer-service focused and less helpful to the people who need you.