
Do LLMs "think"? I have trouble with the title's claim that LLMs have thoughts.


What's "thinking"?


Why the need to anthropomorphize AI? Why say AI "thinks" rather than that it processes, interprets, applies previously calculated statistical weights, or anything other than "think"?

I would argue that binary systems built on silicon are fundamentally different from human biology and deserve to be described differently, not forced into the box of human biology.



