AI "thinks" the way a piece of rope in a dryer "thinks" its way into an advanced knot: a whole lot of random jumbling that eventually leads to a complex outcome.
I see this comparison regularly, but it feels disingenuous. It's akin to saying "if we simulate enough monkeys on typewriters, we'll eventually get the right result."