Hmm. I’ll have to take a closer look at that. You mean that the uncertainties are correlated, right?
“…and we will have renormalization parameters k’m and…”
Can you show where you got that? My impression was that once we got to the set of utility functions that are equivalent up to scale, averaging them just works, without room for more fine-tuning.
But as I said, that part is shaky because I haven’t actually supported those intuitions with any particular assumptions. We’ll see what happens when we build it up from more solid ideas.
No. To quote your own post:
A similar process allows us to arbitrarily set exactly one of the km.
I meant that the utility function resulting from averaging over your uncertainty about the km’s will depend on which km you chose to set arbitrarily in this way. I gave an example of this phenomenon in my original comment.
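For what it’s worth, the dependence being described can be illustrated with a toy calculation (my own hypothetical numbers, not the example from the original comment). With two sub-utilities U1 and U2 combined as k1·U1 + k2·U2, only the ratio r = k2/k1 matters; but since E[k2/k1] and 1/E[k1/k2] generally differ (Jensen’s inequality), fixing k1 = 1 versus fixing k2 = 1 before averaging gives utility functions that are not positive rescalings of each other:

```python
# Hypothetical numbers, illustrating why the averaged utility function
# depends on which k_m is arbitrarily set to 1.
# Suppose the ratio r = k2/k1 is 1/2 or 2 with equal probability.
ratios = [0.5, 2.0]

# Normalization 1: fix k1 = 1 and average over k2 = r.
k2_avg = sum(ratios) / len(ratios)                  # E[k2/k1] = 1.25
# Normalization 2: fix k2 = 1 and average over k1 = 1/r.
k1_avg = sum(1 / r for r in ratios) / len(ratios)   # E[k1/k2] = 1.25

def utility(k1, k2, u1, u2):
    """Combined utility k1*U1 + k2*U2 for an outcome worth (u1, u2)."""
    return k1 * u1 + k2 * u2

# Outcome A is worth (U1, U2) = (1, 0); outcome B is worth (0, 1).
# Fixing k1: U = U1 + 1.25*U2, so B (1.25) beats A (1.0).
# Fixing k2: U = 1.25*U1 + U2, so A (1.25) beats B (1.0).
print(utility(1, k2_avg, 1, 0), utility(1, k2_avg, 0, 1))   # 1.0 1.25
print(utility(k1_avg, 1, 1, 0), utility(k1_avg, 1, 0, 1))   # 1.25 1.0
```

Here the two normalizations even rank A and B oppositely, so the choice of which km to pin down really does matter.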
Oh sorry. I get what you mean now. Thanks.
I’ll have to think about that and see where the mistake is. That’s pretty serious, though.