While well known for this paper and "information theory", Shannon's master's thesis* is worth checking out as well. It demonstrated an equivalence between electrical switching circuits and Boolean algebra, and was one of the key ideas that enabled digital computers.
The funny thing is that, at the time, digital logic circuits were made with relays. For most of the 20th century you could hear relays clacking away at street junctions, inside metal boxes controlling traffic lights.
Then you got bipolar junction transistors (BJTs), and most digital logic, such as ECL and TTL, was based on a different paradigm for a few decades.
Then came the MOS revolution, allowing for large scale integration. And it worked like relays used to, but Shannon's work was mostly forgotten by then.
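The equivalence Shannon showed can be sketched in a few lines: switches wired in series behave like AND, switches in parallel like OR, and a normally-closed contact like NOT. A minimal Python illustration (my own toy example, not Shannon's notation):

```python
# Switches modeled as booleans: True = closed (conducting).

def series(a: bool, b: bool) -> bool:
    """Current flows only if both switches are closed -> AND."""
    return a and b

def parallel(a: bool, b: bool) -> bool:
    """Current flows if either switch is closed -> OR."""
    return a or b

def normally_closed(a: bool) -> bool:
    """A normally-closed contact opens when energized -> NOT."""
    return not a

# XOR built purely from series/parallel contacts -- the classic
# two-way light switch on a staircase:
def two_way_switch(a: bool, b: bool) -> bool:
    return parallel(series(a, normally_closed(b)),
                    series(normally_closed(a), b))
```

The light is on exactly when the two wall switches disagree, i.e. `two_way_switch(a, b) == (a != b)`.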
> Then you got bipolar junction transistors (BJTs), and most digital logic, such as ECL and TTL, was based on a different paradigm for a few decades.
I think the emphasis is misplaced here. It is true that a single BJT, considered as a three-terminal device, does not operate in the same “gated” way that a relay or a CMOS gate does.
But BJTs were still integrated into chips, or assembled into standard design blocks that implemented recognizable Boolean operations, and synthesis of the desired logical functions used tools like Karnaugh maps, which were (as I understand it) outgrowths of Shannon’s approach.
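For what it's worth, the kind of simplification a Karnaugh map yields can be checked by brute force over the truth table. A toy two-variable sketch (my own example, not from the thread): the function a·b + a·¬b covers both cells of the a=1 row of the map, so it reduces to just a.

```python
from itertools import product

def original(a: int, b: int) -> int:
    # a AND b, OR'd with a AND (NOT b)
    return (a and b) or (a and not b)

def simplified(a: int, b: int) -> int:
    # What grouping the a=1 row of the Karnaugh map gives you.
    return a

# Exhaustive check that the two functions agree on all inputs.
assert all(bool(original(a, b)) == bool(simplified(a, b))
           for a, b in product((0, 1), repeat=2))
```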
I don't think traffic light control systems were designed with Boolean logic before Shannon, nor were they described as 'logic' (or, for that matter, 'digital').
> The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning...
This is my candidate for the sickest burn in a mathematical journal paper...
Aaah, that’s a good point. Hehe :) But I mean he really does delve into information in his exposition. Very clear.
Though I’m intrigued by the idea that there’s more to information than Shannon. Any additional hints?
I’ve often thought about a metric dual to entropy, called ‘organization’, that doesn’t measure disorder/surprise but measures structure and coherence. But I don’t think it’s exactly a dual.
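For context, the entropy side of that pairing is straightforward to compute; a minimal Python sketch of Shannon entropy in bits (the ‘organization’ dual is left as the open idea it is):

```python
import math

def shannon_entropy(probs):
    """H(p) = -sum(p_i * log2(p_i)), in bits.

    Zero-probability outcomes contribute nothing and are skipped,
    matching the convention 0 * log2(0) = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit per toss; a certain outcome carries none.
print(shannon_entropy([0.5, 0.5]))    # -> 1.0
print(shannon_entropy([0.25] * 4))    # -> 2.0
print(shannon_entropy([1.0]))         # -> 0.0 (no surprise at all)
```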
* https://en.wikipedia.org/wiki/A_Symbolic_Analysis_of_Relay_a...