Hacker News

> Us humans can get used to almost everything. Presumably, simply being used to abuse and not desiring freedom just because you never had (or could even imagine) it doesn’t make being abused or lacking freedom “good”.

I don't think this is plausible. You can see your slaver has freedoms you don't, and no doubt you would desire to be free of your shackles like they are, so imagining it wouldn't be difficult at all.

> Someone proposed to grant the tool the right humans normally have. (Humans can learn from reading any stuff under any license, so why not the tool.) Well, you’d think human-like sentience/consciousness are required for those rights

I don't see why sentience would be required for some entity or tool to have the right to learn and synthesize new things like humans do. Copyright is a legal fiction that serves a purpose, and we can grant these rights under any circumstances we like, as long as we think it's a good idea.



If you're arguing that LLMs cannot imagine this "freedom", then I'd say an LLM and a human are fundamentally different. Therefore, LLMs should not be granted human rights.

I think this is a case of wanting to have your cake and eat it too. You can't say LLMs should have some human rights (particularly the ones that generate revenue), but not others, like a right to freedom.

> I don't see why sentience would be required for some entity or tool to have the right to learn and synthesize new things like humans do

On the contrary, I don't see why sentience should not be required.

For as long as these laws have existed, they have only applied to humans. Dogs cannot use them. A plant cannot use them. It is therefore reasonable to say you must be a human to use these rights. In my mind, what is unreasonable is claiming a computer program should be granted these rights. You'd have to justify why that should be the case, and what good it can do for humanity as a whole.

Turns out that's very hard, so AI people don't do it. They just give up. Instead they start from an assumption that puts their ideology in a favorable position: that computer programs should be awarded human rights.

But that assumption, you'll find, is not actually foolproof. If you ask around, a lot of everyday people will consider it preposterous. They might call you insane. So, to me, you must justify it in tangible terms.


> You can't say LLMs should have some human rights (particularly the ones that generate revenue), but not others, like a right to freedom.

There is no evidence that this is the case. These rights are not necessarily all or nothing. They are all or nothing for humans because humans have a bundle of properties that entail these rights, but artificial intelligences may have only a subset of those properties, and so logically may only get a subset of those rights.

> On the contrary, I don't see why sentience should not be required.

Sentience is the ability to feel. All that's needed for learning is the ability to perceive and have thoughts. Maybe there's some deep, intrinsic connection between the two, but this is not known at this time, and therefore I see no reason to connect the two.

> In my mind, what is unreasonable is claiming a computer program should be granted these rights.

There's a long history of human abuse of "lower animals" because we assumed they were dumb and non-sentient. Turns out that this is not the case. We should not be so open-minded that our brains fall out, but we should also be very wary of repeating our old mistakes.


> We should not be so open-minded that our brains fall out, but we should also be very wary of repeating our old mistakes

Precisely, which is why it makes absolutely no sense to me to say that AI can't be granted a right to freedom.

I mean, what are you even arguing here? Do you not understand that this statement is in support of my position, not against?

> Sentience is the ability to feel. All that's needed for learning is the ability to perceive and have thoughts.

Highly debatable. You just made this up; these aren't the definitions of anything. Once again, you need to bring something tangible to the table or people will call you crazy.

> therefore I see no reason to connect the two

Once again, this is your problem here. You're starting from an assumption that favors your stance. You can't do that, especially when said assumption has never, not even once, been true in all of human history.

Au contraire, I see no reason NOT to connect the two, and you certainly haven't given any reasons why. These rights have always, and only, applied to humans. I say we retain that status quo until someone gives us something to show otherwise.


> artificial intelligences may have only a subset of those properties

In order to split these qualities you need to understand what they are and define them well from first principles. Long story short, if you have solved the hard problem of consciousness we are eagerly awaiting your world-shattering paper.

To me a claim that an LLM is sufficiently like a human when it ingests data, but suddenly merely a tool when its rights start being concerned, is mental gymnastics unsupported by requisite levels of philosophical inquiry.

> There's a long history of human abuse of "lower animals" because we assumed they were dumb and non-sentient. Turns out that this is not the case

If you apply that logic to LLMs, you have bigger issues than granting them a single right that only puts their operators in the clear when it concerns copyright laundering.


> You can see your slaver has freedoms you don't

Cool, so slavery where slaves do not see the slavers (let us call it “proper segregation”) is OK?

> I don't see why sentience would be required for some entity or tool to have the right to learn and synthesize new things like humans do

If sentience is not required for a “right” to learn, then I have nothing else to say to you. There is nothing there that is even learning. Learning is a concept that presumes an entity with volition, aspiration, consciousness.


> Cool, so slavery where slaves do not see the slavers (let us call it “proper segregation”) is OK?

Sorry, you cannot erase the desire for autonomy even with "proper segregation".

> If sentience is not required for a “right” to learn, then I have nothing else to say to you. There is nothing there that is even learning. Learning is a concept that presumes an entity with volition, aspiration, consciousness.

Learning does not presume any such thing, and I also don't think you understand the meaning of sentience.


> Sorry, you cannot erase the desire for autonomy even with "proper segregation".

Good, then we are on the same page with respect to abuse where LLMs are concerned, if we are to consider them sentient (as a prerequisite for learning).

> Learning does not presume any such thing, and I also don't think you understand the meaning of sentience.

Look it up.



