
AI "thinks" the way a piece of rope in a dryer "thinks" its way into an elaborate knot: a whole lot of random jumbling that eventually produces a complex outcome.


Ah yes, I too often have extended English conversations with my tumble-dried rope.


I see this argument a lot, but it feels disingenuous. It's akin to saying "if we simulate enough monkeys on typewriters, we'll eventually get the right result."


If we could motivate the monkeys sufficiently with bananas, we'd probably improve those odds substantially.



