That’s true, but there is an interesting way in which the uncertainty principle is still about uncertainty.
It turns out that for any functions f and g related by a Fourier transform, where |f|^2 and |g|^2 are probability distributions, the sum of these distributions’ Shannon entropies H is bounded below:
H(|f|^2) + H(|g|^2) ≥ log(e/2)
And this implies the traditional Heisenberg uncertainty principle!
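If you want to see the bound in action, here is a minimal numerical sketch (plain NumPy; the grid sizes and the helper name entropy are mine). It uses the e^{-2πixξ} Fourier convention, under which the constant is log(e/2), and checks that a Gaussian wavefunction saturates the bound, Gaussians being the equality case:

    import numpy as np

    # Uniform grid wide enough that the Gaussian's tails are negligible.
    N = 1 << 16
    L = 200.0
    dx = L / N
    x = (np.arange(N) - N // 2) * dx

    # Gaussian wavefunction, normalized so that ∫ |f|^2 dx = 1.
    sigma = 1.3
    f = (2 * np.pi * sigma**2) ** (-0.25) * np.exp(-x**2 / (4 * sigma**2))

    # Continuous Fourier transform in the e^{-2πixξ} convention,
    # approximated by the FFT. Only |g| matters for the entropy, so the
    # phase factor coming from the grid offset is dropped.
    g_abs = dx * np.abs(np.fft.fft(f))

    def entropy(p, dvol):
        # Differential Shannon entropy -∫ p log p of a sampled density.
        p = p[p > 0]
        return -np.sum(p * np.log(p)) * dvol

    H_x = entropy(np.abs(f) ** 2, dx)          # position-space entropy
    H_xi = entropy(g_abs ** 2, 1.0 / (N * dx)) # momentum-space entropy

    print(H_x + H_xi, ">=", np.log(np.e / 2))  # both sides ≈ 0.30685

Replacing f with, say, a double-peaked wavepacket pushes the sum strictly above log(e/2).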
Yes, I’m a big fan of the Entropic Uncertainty Principle. One thing to note about it is that the definition of entropy uses only the measure-space structure of the reals, whereas the definition of variance also uses the metric. So Heisenberg’s principle uses more structure to say less. And it’s not as if the extra structure is merely redundant: you can say useful things with the metric structure, like Hardy’s Uncertainty Principle. So Heisenberg’s version takes useful information and then just throws it away.
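To make that structural point concrete, here is a small sketch in the same spirit (all the names are mine): a measure-preserving rearrangement of a density, cutting an interval in two and sliding one piece far away, leaves the entropy untouched but blows up the variance, because variance reads off distances that entropy never sees.

    import numpy as np

    dx = 1e-3
    x = np.arange(8000) * dx  # grid on [0, 8)

    def H(p):
        # Entropy only sees which density values occupy how much length.
        q = p[p > 0]
        return -np.sum(q * np.log(q)) * dx

    def var(p):
        # Variance also needs the metric: it measures squared distances.
        mu = np.sum(x * p) * dx
        return np.sum((x - mu) ** 2 * p) * dx

    # Uniform density of height 1/2 on [0, 2).
    p = np.where(x < 2, 0.5, 0.0)

    # Cut [0, 2) in half and slide the second piece out to [6, 7).
    # The multiset of (density value, length) pairs is unchanged.
    q = np.where((x < 1) | ((x >= 6) & (x < 7)), 0.5, 0.0)

    print(H(p), H(q))      # equal: log 2 ≈ 0.693 in both cases
    print(var(p), var(q))  # 1/3 versus ≈ 9.08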
I’d almost support teaching the Entropic Uncertainty Principle instead of Heisenberg’s to students first learning quantum theory. But unfortunately its proof is much harder. And students are generally more familiar with variance than entropy.
With regard to Eliezer’s original point, the distributions |f|^2 and |g|^2 don’t actually describe our uncertainty about anything. We have perfect knowledge of the wavefunction; there is no uncertainty. I suppose you could say that H(|f|^2) and H(|g|^2) quantify the uncertainty you would have if you were to measure the position or momentum (in Eliezer’s point of view this would be indexical uncertainty about which world we were in), although you can’t perform both of these measurements on the same particle.