
A toaster cannot perform the task of making toast any more than an Allen key can perform the task of assembling flat pack furniture.


Let me understand: is your claim that a toaster can't toast bread because it cannot initiate the toasting of its own volition?

Ignoring the silly wording, that is a very different thing than what robotresearcher said. And actually, in a weird way I agree. Though I disagree that a toaster can't toast bread.

Let's take a step back. At what point is it me making the toast and not the toaster? Is it because I have to press the lever? We can automate that. Is it because I have to put my bread in? We can automate that. Is it because I have to have the desire to have toast and initiate the chain of events? How do you measure that?

I'm certain that's different from measuring task success. And that's why I disagree with robotresearcher. The logic isn't self-consistent.


> Though I disagree that a toaster can't toast bread.

If a toaster can toast bread, then an Allen key can assemble furniture. Both of them can do these tasks in collaboration with a human. This human supplies the executive decision-making (what when where etc), supplies the tool with compatible parts (bread or bolts) and supplies the motivating force (mains electricity or rotational torque).

The only difference is that it's more obviously ridiculous when it's an inanimate hunk of bent metal. Wait no, that could mean either of them. I mean the Allen key.

> Let's take a step back. At what point is it me making the toast and not the toaster?

I don't know exactly where that point is, but it's certainly not when the toaster is making zero decisions. It begins to be a valid question if you are positing a hypothetical "smart toaster" which has sensors and software capable of achieving toasting perfection regardless of bread or atmospheric variables.

> Is it because I have to press the lever? We can automate that.

You might even say automatic beyond belief.


  > I don't know exactly where that point is, but it's certainly not when the toaster is making zero decisions.
And this is the crux of my point. Our LLMs still need to be fed prompts.

Where the "decision making" happens gets fuzzy, but that's true in the toaster too.

Your run-of-the-mill toaster is a heating element and a timer. Is the timer a rudimentary decision process?

A more modern toaster is going to include a thermocouple or thermistor to ensure that the heating elements don't light things on fire. This requires a logic circuit. Is this a decision process? (It is entirely deterministic)
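
For concreteness, that "decision" amounts to something like this minimal sketch (read_thermistor and set_heater are hypothetical hardware hooks, stubbed out here so it runs):

  # Deterministic safety cutoff: same reading in, same action out.
  MAX_SAFE_TEMP_C = 230.0  # hypothetical cutoff temperature

  def read_thermistor():
      return 180.0  # stub: a real toaster would read an ADC here

  def set_heater(on):
      print("heater", "on" if on else "off")

  def safety_check():
      # Cut power whenever the element exceeds the limit.
      set_heater(read_thermistor() < MAX_SAFE_TEMP_C)

  safety_check()  # -> heater on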

A more advanced one is going to incorporate a PID controller, just like your oven. It is deterministic in the sense that it will create the same outputs given the same inputs but it is working with non-deterministic inputs.

These PIDs can also look a lot like small neural networks, and in some cases they are implemented that way. These processes need not be deterministic. You can even approach this problem through RL-style optimizations. There are a lot of solutions here.
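
As a sketch of what such a control loop looks like (the gains, setpoint, and numbers below are made up for illustration, not any real toaster's firmware):

  # Textbook PID: deterministic given its inputs, but the inputs
  # (sensor readings) are themselves noisy and non-deterministic.
  class PID:
      def __init__(self, kp, ki, kd, setpoint):
          self.kp, self.ki, self.kd, self.setpoint = kp, ki, kd, setpoint
          self.integral = 0.0
          self.prev_error = 0.0

      def update(self, measured, dt):
          error = self.setpoint - measured
          self.integral += error * dt
          derivative = (error - self.prev_error) / dt
          self.prev_error = error
          return (self.kp * error
                  + self.ki * self.integral
                  + self.kd * derivative)

  pid = PID(kp=2.0, ki=0.1, kd=0.5, setpoint=190.0)  # target temp, C
  duty = pid.update(measured=25.0, dt=0.1)           # heater duty command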

When you break it down like this, I agree it is hard to define that line. But that's part of what I'm after with robotresearcher. The claim was about task performance, but then the answer with the toaster was that the human and toaster work together. I believe dullcrisp used the toaster as an example because it is a much simpler problem than playing a game of chess (or at least it appears that way).

So the question still stands, when does the toaster make the toast and when am I no longer doing so?

When is the measurement attributed to the toaster's ability to make toast vs mine?

Now replace toasting with chess, programming, music generation, or anything else for which we have far less well-defined metrics. Sure, we don't have a perfect definition of what constitutes toast, but it is definitely far more bounded than these other things. We have accuracy in the definition, and I'd argue even fairly good precision. There's high agreement on what we'd call toast, what we'd call untoasted bread, and what we'd call burnt bread. We can at least address the important part of this question without infinite precision in how we discriminate between these classifications.


The question of an "ability to make toast" is a semantic question bounded by what you choose to encompass within "make toast". At best, a regular household toaster can "make heat"[1]. A regular household toaster certainly cannot load itself with bread, which I would consider unambiguously within the scope of the "make toast" task. If you disagree, then we have a semantic dispute.

This is also, at least in part, the Sorites Paradox.[0] There is obviously a gradient of ambiguity between human and toaster responsibility, but we can clearly tell the extremes apart even when the boundary is indeterminate. When does a collection of grains become a heap? When does a tool become responsible for the task? These are purely semantic questions. Strip away all normative loading and the argument disappears.

[0] https://en.wikipedia.org/wiki/Sorites_paradox

[1] Yada yada yada first law of thermodynamics etc


You and the toaster made toast together. Like you and your shoes went for a walk.

Not sure where you imagine my inconsistency is.


That doesn't resolve the question.

  > Not sure where you imagine my inconsistency is.

  >> Let's take a step back. At what point is it me making the toast and not the toaster? Is it because I have to press the lever? We can automate that. Is it because I have to put my bread in? We can automate that. Is it because I have to have the desire to have toast and initiate the chain of events? How do you measure that?
You have a PhD and 30 years of experience, so I'm quite confident you are capable of adapting the topic of "making toast" to "playing chess", "doing physics", "programming", or any similar topic where we are benchmarking results.

Maybe I (and others?) have misunderstood your claim from the get-go? You seem to have implied that LLMs understand chess, physics, programming, etc. because of their performance. Yet now it seems your claim is that the LLM and I are doing those things together. If your claim is that an LLM understands programming the same way a toaster understands how to make toast, then we probably aren't disagreeing.

But if your claim is that an LLM understands programming because it can produce programs that yield correct output on test cases, then what's the difference from the toaster? I put the prompts in and pushed the button to make it toast.

I'm not sure why you imagine the inconsistency is so difficult to see.


When did I say that the chess program was different to a toaster? I don’t believe it is, so it’s not a thing I’m likely to say.

I don’t think the word ‘understand’ has a meaning that can apply in these situations. I’m not saying the toaster or the chess program understands anything, except in the limited sense that some people might describe them that way, and some won’t. In both cases that concept is entirely in the head of the describer and not in the operation of the device.

I think the claimed inconsistency is in views you ascribe to me, and not those I hold. ‘Understand’ is a category error with respect to these devices. They neither do nor don’t. Understanding is something an observer attributes for their own reasons, and it entails nothing for the subject.


I concur that ascribing understanding to the machines that we have is a category error.

The reason I believe it was brought up is that understanding is not a category error when ascribed to people.

And if we claim to have a plan to create machines that are indistinguishable from people, we likely first need to understand what it is that makes people distinguishable from machines, and that doesn’t seem to be on any of the current AI companies’ roadmap.


Declaring something as having "responsibility" implies some delegation of control. A normal toaster makes zero decisions, and as such it has no control over anything.


A toaster has feedback control over its temperature, time control over its cooking duration, and start/stop control by attending to its start/cancel buttons. It makes decisions constantly.

I simply can't make toast without a toaster, however psychologically primary you want me to be. Without either of us, there's no new toast. Team effort every time.

And to make it even more interesting, the same is true for my mum and her toaster. She does not understand how her toaster works. And yet: toast reliably appears! Where is the essential toast understanding in that system? Nowhere and everywhere! It simply isn't relevant.


> A toaster has feedback control over its temperature, time control over its cooking duration

Most toasters are heating elements attached to a timer adjusted by the human operator. They don’t have any feedback control. They don’t have any time control.

> I simply can't make toast without a toaster

I can’t make toast without bread either, but that doesn’t make the bread “responsible” for toasting itself.

> She does not understand how her toaster works.

My mum doesn’t understand how bread is made, but she can still have the intent to acquire it from a store and expose it to heat for a nominal period of time.


  > I simply can't make toast without a toaster
You literally just put bread on a hot pan.


So despite passing the Toasting Test, a hot pan is not really a toaster?

It’s clear that minds are not easily changed when it comes to noticing and surrendering folk psychology notions that feel important.


You said you couldn't make toast without a toaster. Sorry if I didn't understand what you actually meant.


Does this mean an LLM doesn’t understand, but an LLM automated by a cron job does?


Just like a toaster with the lever jammed down, yes!


I mean, that was the question I was asking... If it wasn't clear, my answer is no.


This is contrary to my experience with toasters, but it doesn’t seem worth arguing about.


How does your toaster get the bread on its own?


It’s only responsible for the toasting part. The bread machine makes the bread.


If the toaster is the thing that “performs the task of making toast”, what do you call it when a human gets bread and puts it in a toaster?


I guess we could call it delegation?


“Hey man, I’m delegating. Want a slice?”


Hi delegating! No, but I'd like some toast.


Can’t help you with that, I’m not a toaster.


Seems more like dependency injection. :p


What is your definition of "responsible"? The human is making literally all decisions and isn't abdicating responsibility for anything. The average toaster has literally one operational variable (cook time) and even that minuscule proto-responsibility is entirely on the human operator. All other aspects of the toaster's operation are decisions made by the toaster's human designer/engineer.


How do you get bread? Don't tell me you got it at the market. That's just paying someone else to get it for you.


  >  That's just paying someone else to get it for you.
We can automate that too![0]

[0] https://news.ycombinator.com/item?id=45623154

(Your name is quite serendipitous to this conversation)



