I remember reading an article a while back that went something like this: they had a digital circuit implementing some function (might have been a filter, but I'm not 100% sure). Engineers designed a circuit to perform the function. Then they let a computer iteratively (ML?) place the components to achieve that same function. It got so good that it used fewer gates but somehow worked just as well.
They were really confused, and finally realized that, unbeknownst to them, the computer had taken advantage of the fact that a moving charge creates a magnetic field, and vice versa. The computer exploited this fact entirely inadvertently, lol.
"Dr. Thompson peered inside his perfect offspring to gain insight into its methods, but what he found inside was baffling.
The plucky chip was utilizing only thirty-seven of its one hundred logic gates, and most of them were arranged in a curious collection of feedback loops.
Five individual logic cells were functionally disconnected from the rest— with no pathways that would allow them to influence the output— yet when the researcher disabled any one of them the chip lost its ability to discriminate the tones.
Furthermore, the final program did not work reliably when it was loaded onto other FPGAs of the same type."
Probably “An evolved circuit, intrinsic in silicon, entwined with physics” [1]. They let a genetic algorithm come up with a better FPGA circuit, but it no longer worked when the apparently non-contributing parts of the circuit were pruned.
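For anyone unfamiliar with how that experiment worked mechanically: the FPGA's configuration bitstream was treated as a genome, and a standard generational GA (selection, crossover, bit-flip mutation) evolved it against a measured fitness score. A minimal sketch of that loop, with a toy stand-in fitness function (the real one graded the chip's tone discrimination in hardware; everything here is illustrative, not Thompson's actual code):

```python
import random

def evolve(fitness, genome_len=50, pop_size=30, generations=100,
           mutation_rate=0.02, seed=0):
    # Minimal generational GA over fixed-length bitstrings -- the same
    # general scheme applied to FPGA configuration bits. Parameter
    # values here are arbitrary choices for the demo.
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        parents = scored[:pop_size // 2]            # truncation selection
        children = []
        while len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, genome_len)      # single-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (rng.random() < mutation_rate)  # bit-flip mutation
                     for bit in child]
            children.append(child)
        pop = children
    return max(pop, key=fitness)

# Toy fitness: count of set bits. In the paper, fitness came from
# physically measuring each candidate circuit's output on the chip.
best = evolve(fitness=sum)
print(sum(best))
```

The key point the thread is circling: because fitness was measured on real silicon rather than in a simulator, the GA was free to exploit any physical effect that moved the score, including analog coupling the gate-level model doesn't capture.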