It’s just that now “lucky socks” is the local Schelling point. It’s possible I don’t understand God very well, but I personally am modally afraid of jinxing stuff or setting myself up for dramatic irony. It has to do with how my personal history’s played out. I was mostly just using the socks thing as an example of the larger problem of how epistemology gets harder when there’s a very powerful entity around. I know I have a really hard time predicting the future because I’m used to… “miracles” occurring and helping me out; I don’t want to take them for granted, but I still want to make accurate predictions… And so on. Maybe I’m over-complicating things.
Okay, I can understand that. It can be annoying. However, the standard framework still applies; you can still use Bayes. It’s like anything else that confuses you.
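For concreteness, “you can still use Bayes” here just means the ordinary update rule applied to whatever hypothesis is on the table. A minimal sketch, with a hypothetical “lucky socks actually help” hypothesis and made-up prior and likelihoods:

```python
# Minimal Bayesian update; all numbers are made-up and purely illustrative.
# Hypothesis H: "wearing the lucky socks actually helps."
prior = 0.05                # P(H) before seeing today's outcome
p_win_given_h = 0.70        # P(good outcome | H)
p_win_given_not_h = 0.50    # P(good outcome | not H)

# Observe one good outcome while wearing the socks.
evidence = prior * p_win_given_h + (1 - prior) * p_win_given_not_h
posterior = prior * p_win_given_h / evidence

print(f"P(H | good outcome) = {posterior:.3f}")  # ~0.069: one observation moves the prior only a little
```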
I see what you’re saying, and it’s a sensible approximation, but I’m not actually sure you can use Bayes in situations with “mutual simulation” like that. Are you familiar with updateless/ambient decision theory, perchance?
No, I’m not. Should I be? Do you have a link to offer?
This post combined with all the comments is perhaps the best place to start, or this post might be an easier introduction to the sorts of problems that Bayes has trouble with. This is the LW wiki hub for decision theory. That said, it would take me a while to explain why I think it’d particularly interest you and how it’s related to things like lucky socks, especially as a lot of the most interesting ideas are still highly speculative. I’d like to write such an explanation at some point but can’t at the moment.
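To give a concrete flavor of the kind of problem alluded to above, a standard toy case is Newcomb’s problem: two reasonable-sounding ways of turning probabilities into an expected-value calculation disagree about what to do, which is part of what motivates updateless-style decision theories. The predictor accuracy and payoffs below are made-up illustrative numbers, not anyone’s canonical formulation:

```python
# Newcomb's problem, toy numbers: a predictor fills an opaque box with
# $1,000,000 iff it predicts you will take only that box; a transparent
# box always holds $1,000. Accuracy and payoffs are illustrative.
accuracy = 0.9  # assumed P(predictor is right)

# "Evidential" reading: condition the opaque box's contents on your own choice.
ev_one_box = accuracy * 1_000_000 + (1 - accuracy) * 0
ev_two_box = accuracy * 1_000 + (1 - accuracy) * (1_000_000 + 1_000)

# "Causal" reading: the box is already filled and your choice can't change it,
# so taking both boxes dominates for any probability you assign to it being full.
for p_full in (0.1, 0.5, 0.9):
    cdt_one = p_full * 1_000_000
    cdt_two = p_full * 1_000_000 + 1_000
    assert cdt_two > cdt_one  # two-boxing "wins" under this reading

print(ev_one_box, ev_two_box)  # 900000.0 vs 101000.0: one-boxing "wins" under the first reading
```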