If you wait for the cosmic background radiation to cool down[1], you get much more total computation out of the same matter; the rate of computation doesn’t seem particularly important. The amount of stuff in a Hubble volume might be decreasing over time, in which case computing earlier allows more communication with distant galaxies. But given the guess about the effect size of waiting on total compute, computing locally in the distant future still buys more total compute than making use of distant galaxies earlier.
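To make that scaling concrete, here is a minimal sketch (my own illustration, not from the paper) of how the Landauer bound on bit erasures per joule varies with the temperature you radiate to; the future temperature used is a hypothetical placeholder, not a prediction:

```python
# Landauer bound: erasing one bit costs at least k_B * T * ln(2) joules,
# so colder surroundings mean more erasures per joule of stored energy.
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def erasures_per_joule(temperature_k: float) -> float:
    """Maximum irreversible bit erasures per joule at temperature T."""
    return 1.0 / (K_B * temperature_k * math.log(2))

t_now = 2.73        # current CMB temperature, K
t_future = 1e-3     # hypothetical far-future background temperature, K

print(f"now:    {erasures_per_joule(t_now):.2e} bit erasures / J")
print(f"future: {erasures_per_joule(t_future):.2e} bit erasures / J")
print(f"gain from waiting: ~{t_now / t_future:.0f}x")
```

The gain is just the ratio of temperatures, which is why the total-compute advantage of waiting can dwarf the gains from using distant galaxies earlier.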
I don’t buy the Fermi paradox angle in the paper: obviously the first thing you do is grab all the lightcone you can get your von Neumann probes on, and prepare the matter for storage in a way that’s less wasteful than the random stuff happening in the wild.
That paper is wrong. There are other systems that are not near maximal-entropy states, and computer-generated entropy can be transferred to them adiabatically at a rate of 1 bit of negentropy per bit of error erased.
As to the post we’re commenting on, the sun probably isn’t the best configuration of matter to use as a power source. But this calculation seems like a reasonable lower bound.
The critique just says that you can get the same advantage even without waiting. But the relevant surprising part of the original claim is that there is a large advantage to be had at all, compared to the Landauer limit at the current background radiation temperature, so this application of the Landauer limit doesn’t actually bound available compute.
The part of the paper I appealed to is exploratory engineering: a design that is theoretically possible, but not trying to be something worth actually building once it becomes feasible in practice. This gives lower bounds on what’s possible by sketching particular ways of getting it, not predictions of what’s likely to actually happen. The critique doesn’t seem to take issue with this aspect of the paper.
Ah, thanks. I should’ve noticed that.
Yeah, you could, but you would be waiting a while. Your reply and two others have made me aware that this post’s limit is too low.
[EDIT: spelling]