Does that even help? The “expected physical frequency” of the 100th digit of pi is the same as the EPF of the 16th digit of the mass of a given object, but my expectation of 100 independent samples of the first is drastically different from the same for the second.
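A toy sketch of the asymmetry (helper names, the noise scale, and the measurement model are all illustrative assumptions): repeatedly "sampling" a digit of pi returns the same value every time, while repeatedly sampling a deep digit of a noisy physical measurement varies, even though a uniform distribution over 0-9 might be assigned to both up front.

```python
import random

# First 30 decimal digits of pi (a fixed mathematical constant).
PI_DIGITS = "141592653589793238462643383279"

def sample_pi_digit(n):
    # "Sampling" the n-th decimal digit of pi: deterministic,
    # so every independent sample returns the same digit.
    return int(PI_DIGITS[n - 1])

def sample_mass_digit(n, true_mass=1.0, noise_sd=1e-6):
    # n-th significant digit of a noisy measurement of a mass.
    # Well below the measurement precision, the digit is
    # effectively uniform over 0-9 and varies between samples.
    m = true_mass + random.gauss(0, noise_sd)
    digits = f"{m:.20f}".replace(".", "").lstrip("0")
    return int(digits[n - 1])

random.seed(0)
pi_samples = {sample_pi_digit(16) for _ in range(100)}
mass_samples = {sample_mass_digit(16) for _ in range(100)}
print(len(pi_samples))    # 1: the "samples" are all identical
print(len(mass_samples))  # many distinct digits
```

Both digits get the same "expected physical frequency" assignment before looking, but only the measurement digit behaves like a repeatable random sample.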
My “in this context” meant in the context of: “For example, any event with any non-zero probability of happening, no matter how large the negative exponent, would be assured of actually happening an infinite number of times somewhere in our very own universe”
The term “probability” doesn’t always refer to expected physical frequencies.
Expected physical frequency always matches probability. This specific context has the same problem. If I believe that superintelligence is possible at 50% confidence, it does not follow that on 50% of candidate worlds, superintelligence develops, and on the others, it doesn’t.
Well, of course it doesn’t necessarily follow. The fact that a resource-limited agent’s estimates sometimes prove inaccurate doesn’t imply that their estimates are not probabilities. E.g. see here:
Probability is a measure of the expectation that an event will occur or that a statement is true. Probabilities are given a value between 0 (will not occur) and 1 (will occur). The higher the probability of an event, the more certain we are that the event will occur.
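One way to see why a determinate fact can still carry a 50% credence (a toy sketch; the even/odd claim and the 40-digit string are illustrative): each claim "the n-th digit of pi is even" is already true or false, with no 50/50 split across worlds, yet an agent assigning 0.5 to each claim is stating a probability, and across many such claims that credence is roughly calibrated.

```python
# First 40 decimal digits of pi (a fixed mathematical constant).
PI_DIGITS = "1415926535897932384626433832795028841971"

# Every individual claim here has a definite truth value, but the
# agent's uniform 0.5 credence is checkable in aggregate.
claims = [int(d) % 2 == 0 for d in PI_DIGITS]
freq = sum(claims) / len(claims)
print(freq)  # close to 0.5 over these 40 digits
```

The point is only that "probability" here tracks the agent's state of knowledge, not a physical frequency within any one case.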
Yes sorry, I fucked up the wording/thinking there. EPF always matches probability, and estimates are always probabilities.
Edited the parent to have clearer wording.