I do assign a really low prior probability to the existence of lucky socks anywhere
You do realize it might very well mean death to your Bayes score to say or think things like that around an omnipotent being who has a sense of humor, right? This is the sort of Dude Who wrestles with a mortal then names a nation to honor the match just to taunt future wannabe-Platonist Jews about how totally crazy their God is. He is perfectly capable of engineering some lucky socks just so He can make fun of you about it later. He’s that type of Guy. And you do realize that the generalization of Bayes score to decision theoretic contexts with objective morality is actually a direct measure of sinfulness? And that the only reason you’re getting off the hook is that Jesus allegedly managed to have a generalized Bayes score of zero despite being unable to tell a live fig tree from a dead one at a moderate distance and getting all pissed off about it for no immediately discernible reason? Just sayin’, count your blessings.
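For reference, the ordinary (un-generalized) Bayes score invoked here is the logarithmic scoring rule, under which zero is the best score attainable; that is what makes "a generalized Bayes score of zero" perfection rather than mediocrity. A minimal statement of the standard version:

```latex
% Logarithmic (Bayes) score of a forecaster over observed outcomes x_1, ..., x_n:
% each P(x_i) <= 1, so every term is <= 0 and the total never exceeds zero;
% it equals zero only if probability 1 was assigned to everything that actually happened.
\[
  S \;=\; \sum_{i=1}^{n} \log P(x_i) \;\le\; 0,
  \qquad
  S = 0 \iff P(x_i) = 1 \ \text{for all } i.
\]
```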
He is perfectly capable of engineering some lucky socks just so He can make fun of you about it later.
Yes, of course. Though why he’d do that, instead of all the other things he could be doing, like creating a lucky hat or sending a prophet to explain the difference between “please don’t be an idiot and quibble over whether it might hurt my feelings if you tell me the truth” and “please be as insulting as possible in your dealings with me”, is beyond me.
And you do realize that the generalization of Bayes score to decision theoretic contexts with objective morality is actually a direct measure of sinfulness?
No, largely because I have no idea what that would even mean. However, if you mean that using good epistemic hygiene is a sin because there’s objective morality, or if you think the objective morality only applies in certain situations which require special epistemology to handle, you’re wrong.
It’s just that now “lucky socks” is the local Schelling point. It’s possible I don’t understand God very well, but I personally am modally afraid of jinxing stuff or setting myself up for dramatic irony. It has to do with how my personal history’s played out. I was mostly just using the socks thing as an example of this larger problem: epistemology gets harder when there’s a very powerful entity around. I know I have a really hard time predicting the future because I’m used to… “miracles” occurring and helping me out, and I don’t want to take them for granted, yet I still want to make accurate predictions… And so on. Maybe I’m over-complicating things.
Okay, I can understand that. It can be annoying. However, the standard framework does still apply; you can still use Bayes. It’s like anything else that confuses you.
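As a concrete illustration of "you can still use Bayes" (with entirely made-up numbers, not anything claimed in this exchange), the lucky-socks hypothesis itself can be updated the ordinary way, by comparing how likely the observed record is under each hypothesis:

```python
# Toy Bayesian update on the "lucky socks" hypothesis.
# All priors, likelihoods, and win/loss counts below are illustrative assumptions.

def posterior(prior, p_win_if_lucky, p_win_if_plain, wins, losses):
    """P(socks are lucky | observed record), by Bayes' theorem."""
    likelihood_lucky = p_win_if_lucky ** wins * (1 - p_win_if_lucky) ** losses
    likelihood_plain = p_win_if_plain ** wins * (1 - p_win_if_plain) ** losses
    joint_lucky = prior * likelihood_lucky
    return joint_lucky / (joint_lucky + (1 - prior) * likelihood_plain)

# A "really low prior", as in the quoted comment, plus a modest winning streak:
print(posterior(prior=1e-6, p_win_if_lucky=0.9, p_win_if_plain=0.5, wins=10, losses=2))
# ~1.4e-05: the streak shifts the odds, but nowhere near enough to overcome the prior.
```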
I see what you’re saying and it’s a sensible approximation but I’m not actually sure you can use Bayes in situations with “mutual simulation” like that. Are you familiar with updateless/ambient decision theory perchance?
No, I’m not. Should I be? Do you have a link to offer?

This post combined with all the comments is perhaps the best place to start, or this post might be an easier introduction to the sorts of problems that Bayes has trouble with. This is the LW wiki hub for decision theory. That said, it would take me a while to explain why I think it’d particularly interest you and how it’s related to things like lucky socks, especially as a lot of the most interesting ideas are still highly speculative. I’d like to write such an explanation at some point but can’t at the moment.
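For readers who want a feel for what "the sorts of problems that Bayes has trouble with" means before following the links, here is a sketch of Newcomb's problem, the standard toy example of choosing while being simulated by a predictor. The payoffs and predictor accuracy are assumed for illustration; they are not taken from the linked posts.

```python
# Newcomb's problem, the standard toy case of choosing while being predicted.
# An opaque box holds $1,000,000 only if the predictor foresaw you taking it alone;
# a transparent box always holds $1,000. Numbers below are illustrative assumptions.

ACCURACY = 0.99          # assumed probability the predictor anticipates your choice
BIG, SMALL = 1_000_000, 1_000

# Expected value if you condition on your own choice (evidential-style reasoning):
ev_one_box = ACCURACY * BIG
ev_two_box = (1 - ACCURACY) * BIG + SMALL

print(f"one-box: ${ev_one_box:,.0f}")   # $990,000
print(f"two-box: ${ev_two_box:,.0f}")   # $11,000

# Yet once the boxes are filled, taking both is $1,000 better no matter what's inside,
# so a purely causal analysis says two-box. Updateless/ambient decision theories are
# attempts to resolve this kind of clash, which straightforward Bayesian updating
# on your own action doesn't settle by itself.
```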