I agree about Engines of Cognition. It got me really interested in the parallels between information theory and thermodynamics and led me to start reading a lot more about the former, including the classic Jaynes papers. I think it gave me a deeper understanding of why e.g. the Carnot limit holds, and it led me to read about the interesting discovery that the thermodynamic availability (extractable work) of a system is equal to its Kullback-Leibler divergence (a generalization of informational entropy) from its environment.
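If I'm remembering the result correctly, it looks roughly like this (my notation, and I'm glossing over conditions such as a single heat bath at temperature T): for a system whose actual distribution over microstates is p, while contact with the environment would relax it to the equilibrium (Boltzmann) distribution p_eq, the maximum extractable work is

$$W_{\text{max}} \;=\; k_B T \, D_{\mathrm{KL}}(p \,\|\, p_{\text{eq}}) \;=\; k_B T \sum_x p(x) \ln \frac{p(x)}{p_{\text{eq}}(x)}.$$

When p = p_eq the divergence vanishes and no work can be extracted, which is one way of restating the second law.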
Second for me would have to be Artificial Addition, which helped me understand why attempts to “trick” a system into displaying intelligence are fundamentally misguided.
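To make the contrast concrete, here's a toy sketch (my own, not from the post) of the difference between a system that has memorized some "addition facts" and one that actually implements addition. The two look identical on the cases the table was built for, and the trick breaks down everywhere else.

```python
# Toy illustration (mine, not from the post): "tricking" a system into
# displaying addition vs. actually implementing addition.

# A memorized table of "addition facts" for single-digit inputs.
ADDITION_FACTS = {(a, b): a + b for a in range(10) for b in range(10)}

def fake_adder(a, b):
    """Returns the memorized answer, or gives up outside the table."""
    return ADDITION_FACTS.get((a, b))  # None for anything it never saw

def real_adder(a, b):
    """Implements the underlying procedure, so it generalizes."""
    return a + b

print(fake_adder(3, 4), real_adder(3, 4))          # 7 7     -- indistinguishable here
print(fake_adder(123, 456), real_adder(123, 456))  # None 579 -- the trick breaks down
```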
The biggest “aha” post was probably the one linking thermodynamics to beliefs (The Second Law of Thermodynamics, and Engines of Cognition, and the following one, Perpetual Motion Beliefs), because it linked two subjects I knew about in a surprising and interesting way, deepening my understanding of both.
Apart from that, “Tsuyoku Naritai” was the one that got me hooked, though I didn’t really “learn” anything from it; I just like the attitude it portrays.