I’m not quite sure what the point of all of this is… You’ve decided you want to be able to define what a god’s-eye probability for something would be, and indeed have come up with what (at least initially) seems like a reasonable definition. But why should I want to define such a thing in the first place, if, as you yourself admit, it isn’t actually useful for anything?
Bayesianism and frequentism both have their limitations.
I often talk about the “true probability” of something (e.g. AGI by 2040). When asked what I mean, I generally say something like “the probability I would have if I had perfect knowledge and unlimited computation”—but that isn’t quite right, because if I had truly perfect knowledge and unlimited computation I would be able to resolve the probability to either 0 or 1. Perfect knowledge and computation within reason, I guess? But that’s kind of hand-wavey. What I’ve actually been meaning is the butterfly probability, and I’m glad this concept/post now exists for me to reference!
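To make that distinction concrete, here’s a toy sketch (my own operationalization for illustration, not anything claimed in the post): treat the butterfly probability as the fraction of runs of a chaotic system in which the event occurs when you perturb the initial conditions by a tiny amount. Any single perfectly-known world resolves to 0 or 1, but the average over tiny perturbations is generally strictly in between.

```python
import random

def outcome(x0, steps=200):
    """Iterate the logistic map (a simple chaotic system) and report
    whether the trajectory ends up in the upper half of [0, 1]."""
    x = x0
    for _ in range(steps):
        x = 3.9 * x * (1 - x)
    return x > 0.5

def butterfly_probability(x0, epsilon=1e-9, n_samples=10_000):
    """Fraction of tiny perturbations of x0 for which the event occurs.
    One toy way to operationalize a 'butterfly probability' (my assumption):
    each perturbed world individually resolves to 0 or 1, but the average
    over perturbations is generally an intermediate number."""
    hits = 0
    for _ in range(n_samples):
        x_perturbed = x0 + random.uniform(-epsilon, epsilon)
        hits += outcome(x_perturbed)
    return hits / n_samples

print(outcome(0.123456789))                # a single world: True or False
print(butterfly_probability(0.123456789))  # some value strictly between 0 and 1
```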
More generally I’d say it’s useful to make intuitive concepts more precise, even if it’s hard to actually use the definition, in the same way that I’m glad logical induction has been formalized despite being intractable. Also I’d say that this is an interesting concept, regardless of whether it’s useful :)
How would you ever know what the butterfly probability of something is, such that it would make sense to refer to it? In what context is it useful?
“My probability is 30%, and I’m 50% sure that the butterfly probability is between 20% and 40%” carries useful information, for example. It tells people how confident I am in my probability.
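As a toy illustration of that last point (my own numbers, purely for concreteness): the same point estimate of 30% can come with very different amounts of confidence, and a credible interval over the butterfly probability is one way to communicate the difference.

```python
from scipy.stats import beta

# Two beliefs with the same point estimate (mean 0.3) for the butterfly
# probability, but very different confidence in that estimate.
diffuse = beta(3, 7)      # weakly-held 30%
confident = beta(30, 70)  # strongly-held 30%

for name, belief in [("diffuse", diffuse), ("confident", confident)]:
    mass = belief.cdf(0.4) - belief.cdf(0.2)
    print(f"{name}: mean={belief.mean():.2f}, "
          f"P(butterfly prob in [0.2, 0.4]) = {mass:.2f}")

# The diffuse belief puts only ~50% of its mass in [0.2, 0.4] (matching the
# "I'm 50% sure" statement above), while the confident one puts ~97% there,
# even though both report "my probability is 30%".
```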