I agree with you that, as stated, the analogy risks dangerous superficiality.
The ‘cognitive’ work of evolution came from the billions of years of evolution in the innumerable forms of life that lived, hunted and reproduced through the eons. Effectively, we could see evolution-by-natural-selection as something like a simple, highly parallel, stochastic, slow algorithm: a simple many-tape randomized Turing machine running for a very large number of timesteps.
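To make the "simple, highly parallel, stochastic, slow" framing concrete, here is a toy sketch in the spirit of Dawkins's classic "weasel" demonstration — selection over a population of copied-with-errors strings. It is of course a caricature (fixed target, truncation selection), not a model of real evolution; all names and parameters here are illustrative choices:

```python
import random

random.seed(0)

TARGET = "methinks it is like a weasel"
ALPHABET = "abcdefghijklmnopqrstuvwxyz "
POP_SIZE = 200          # many "tapes" evolving in parallel
MUTATION_RATE = 0.02    # stochastic, undirected copying errors

def fitness(s):
    # Number of positions matching the (fixed, toy) environment.
    return sum(a == b for a, b in zip(s, TARGET))

def mutate(s):
    # Each character independently has a small chance of mutating.
    return "".join(
        random.choice(ALPHABET) if random.random() < MUTATION_RATE else c
        for c in s
    )

# Start from pure noise.
population = ["".join(random.choice(ALPHABET) for _ in TARGET)
              for _ in range(POP_SIZE)]

generations = 0
while max(fitness(s) for s in population) < len(TARGET):
    # Selection: the fitter half survives and reproduces (with errors).
    population.sort(key=fitness, reverse=True)
    parents = population[: POP_SIZE // 2]
    population = parents + [mutate(random.choice(parents)) for _ in parents]
    generations += 1
```

Each step is trivially simple; all the "cognitive" work is paid for in the large number of timesteps and parallel copies — which is the point of the analogy.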
A way to put some (vegan) meat on the bones of this analogy would be to look at conditional KT-complexity. KT-complexity is a version of Kolmogorov complexity that also accounts for the time cost of running the generating program. Under this lens:

- In KT-complexity, pseudorandomness functions just like randomness.
- Algorithms may indeed be copied, and the copy operation is fast and takes very little memory overhead.
- Just as with Kolmogorov complexity, we can rejiggle the definition and think in terms of an algorithmic probability.
- A private-public key pair is trivial in a pure Kolmogorov-complexity framework but is correctly modelled in a KT-complexity framework.
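For concreteness, one standard way to fold runtime into description length is Levin's time-bounded complexity $Kt$ — a close relative of the KT measure gestured at above, though the exact resource accounting differs between variants:

```latex
% Levin's Kt: charge the log of the runtime on top of program length.
Kt(x) \;=\; \min_{p,\,t}\,\bigl\{\, |p| + \log t \;:\; U(p)\ \text{halts and outputs}\ x\ \text{within}\ t\ \text{steps} \,\bigr\}

% The corresponding algorithmic probability weights each string by
\Pr[x] \;\propto\; 2^{-Kt(x)}
```

The "rejiggle into an algorithmic probability" move in the list above is exactly the second line: strings that are cheap to generate quickly get high weight, and strings with no short-and-fast generating program are treated as (close to) random.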
To deepen the analogy with thermodynamics, one should probably carefully read John Wentworth’s work on generalized heat engines and Kolmogorov sufficient statistics.
To be clear, I am not arguing that evolution is an example of what I’m talking about. The analogy to thermodynamics in what I wrote is straightforwardly correct; there is no need to introduce KT-complexity and muddy the waters. What I am calling work is literally work.