> It is not a good analogy for Claude chats because those chats are not communication between a lawyer and a client.
How is it not? I get that a chatbot is not a person with rights. And NAL.
But for all intents and purposes, it is a communication about legal advice. The way a lot of people use it is legal advice. They will continue to use it that way.
So for the law to then turn around and say that it's evidence that will be used against them is kind of messed up. It means confidentiality of your case is bought by paying a lawyer for legal protection, not because you actually need their advice over a chatbot's.
It's not a communication with a lawyer, though. Asking a guy on the street if it's illegal to sell the meth you have in your pocket is not privileged communication, and he could definitely testify about that after you got arrested!
Repeating something that you heard someone say is the literal definition of hearsay. Typically courts want to hear about facts from people who actually know those facts, not someone who heard someone talking about those facts.
This would fall under the "statement against interest" exception to hearsay, though, because obviously the person who originally said the thing isn't going to want to admit in court that they were committing a crime.
Reporting what you heard someone say is the literal definition of hearsay.
If you want to use someone saying something as evidence in court, they need to say it to the court as directly as is practical. If the person saying it isn't going to say it directly to the court, then it needs to be justified with one of the exceptions to the hearsay rule.
In this example, it would be allowed because the person saying it wouldn't be willing to admit to a crime in court.
It's a statement not offered to prove the truth of the matter asserted - non-hearsay.
It would be hearsay if offered as evidence that you had meth in your pocket. It would not be if offered as evidence that you were inquiring about the legality, to show intent.
It's not a communication if only one human person participates in the conversation. That's just enhanced note-taking and generating. I don't agree with the notion that talking to an LLM is disclosure to a third party, because an LLM is neither a natural person nor even an artificial person recognized at law, like a corporation, trust, LLC, etc.