
No, and not even close. Text cannot model many things. Try learning surgery using a textbook without pictures. Try driving a racecar competitively or learning a foreign language after only reading a book, even one with pictures.

Models of all kinds are always poor surrogates for reality, and models that cannot employ logic or causality CLEARLY cannot model a world in which mechanisms cause all change.

Statistical models can indeed describe many observations, but they can never break out of the echo chamber of copy-catting patterns they have already seen. If a probabilistic engine like a deep net hasn't been exposed to a concept during training, it will never induce that concept's existence. Imagination requires initiative: the proposal of an unknown and unfamiliar outcome via logical inference or causal induction. Until deep nets can employ both of these skills, they will never master the many skills humans routinely use to explain the world or to extend our understanding of how it works.
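The "echo chamber" point is easy to see with a toy example. Here's a minimal sketch (a hypothetical bigram model, not anything from the thread) showing that a purely frequency-based generator can only ever emit tokens present in its training data, however it is sampled:

```python
from collections import defaultdict
import random

class BigramModel:
    """Toy bigram language model: stores observed next-token counts
    and samples from them. It has no mechanism for inventing a token
    it never saw during training."""

    def __init__(self, corpus: str):
        tokens = corpus.split()
        self.vocab = set(tokens)
        self.table = defaultdict(list)
        for a, b in zip(tokens, tokens[1:]):
            self.table[a].append(b)

    def sample(self, start: str, length: int = 10) -> list[str]:
        out = [start]
        for _ in range(length):
            followers = self.table.get(out[-1])
            if not followers:
                break  # dead end: no observed continuation
            out.append(random.choice(followers))
        return out

model = BigramModel("the cat sat on the mat the cat ran")
generated = model.sample("the", 20)
# Every emitted token comes from the training vocabulary.
print(all(tok in model.vocab for tok in generated))  # True
```

Whether large deep nets are limited in the same way is exactly what this thread disputes; the sketch only shows the limit holds for a pure lookup-style statistical model.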



If logic helps you predict the next word in the text, then a text-based model will learn logic.



