I agree about Engines of Cognition. It got me really interested in the parallels between information theory and thermodynamics and led me to start reading a lot more about information theory, including the classic Jaynes papers. I think it gave me a deeper understanding of why e.g. the Carnot limit holds, and led me to read about the interesting result that the thermodynamic availability (extractable work) of a system is proportional to its Kullback-Leibler divergence (also called relative entropy, a generalization of informational entropy) from its environment's equilibrium distribution.
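If I'm remembering the statement correctly, it's roughly: for a system in contact with an environment at temperature $T$, the maximum extractable work is

$$ W_{\max} = k_B T \, D_{\mathrm{KL}}(\rho \,\|\, \rho_{\mathrm{eq}}), $$

where $\rho$ is the system's actual distribution over microstates and $\rho_{\mathrm{eq}}$ is the equilibrium (Boltzmann) distribution it would relax to in contact with its environment. The $k_B T$ factor just converts nats into joules, and a system already at equilibrium ($D_{\mathrm{KL}} = 0$) yields no work, which matches the second-law intuition.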
Second for me would have to be Artificial Addition, which helped me understand why attempts to “trick” a system into displaying intelligence are fundamentally misguided.