I'm as bearish as anyone on the current AI hype, but this particular ship has sailed. Research is revealing that these humongous next-token-prediction networks of weights exhibit underlying structures that seem to map, in some way, onto a form of knowledge about the world, extracted, however imperfectly, from all the text they're trained on.
Arguing that this is meaningfully different from what happens in our own brains is not something I would personally be comfortable with.
> Research is revealing these humongous neural networks of weights for next token prediction to exhibit underlying structures that seem to map in some way to a form of knowledge about the world that is
[[citation needed]]
I am sorry, but I need exceptionally strong proof of that statement. I think it is totally untrue.
This is your mistake right here. It doesn't think. It's a text generator. It can no more think about what year it is than Swiftkey on your phone "thinks" what year it is when you type
NEXT YEAR WILL BE
and press the middle button.
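The "middle button" mechanic being described is just greedy next-word prediction: always pick the single most likely continuation and repeat. A toy sketch of that idea, using a made-up training corpus and a simple bigram count (not how Swiftkey or an LLM is actually implemented):

```python
from collections import Counter, defaultdict

# Invented toy corpus for illustration only.
corpus = (
    "next year will be great . next year will be better . "
    "next year will be the year of linux ."
).split()

# Count, for each word, which words follow it.
successors = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    successors[prev][nxt] += 1

def press_middle_button(word, steps=5):
    """Greedily append the most frequent next word, `steps` times."""
    out = [word]
    for _ in range(steps):
        options = successors.get(out[-1])
        if not options:
            break
        out.append(options.most_common(1)[0][0])
    return " ".join(out)

print(press_middle_button("next"))
```

The generator "completes the sentence" purely from co-occurrence statistics, with no notion of what year it actually is, which is the point of the analogy; whether the much larger models do only this is exactly what the thread is arguing about.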