I like the thought experiment and it seems like an interesting way to split a bunch of philosophical models.
One thing I’m curious about is which you would prefer to be (the three small brains or the big one), and whether that preference should factor into your beliefs here.
Generally, preference should only affect our beliefs when we’re deciding which self-fulfilling prophecies to promote. (Hmm, or when deciding which branches of thought and belief to examine or not, which might be equally salient for real bounded agents like us.)
I can’t see any there, though. What’s your hunch?
The weird thing here is how anthropics might interact with the organization of the universe. Part of this is TDT-ish (in the timeless decision theory sense).
For example, if you were to construct this scenario (one big brain, three smaller brains) and you had control over which one each would prefer to be, how would you line up their preferences?
Given no other constraints, I’d probably construct them such that each one prefers to be the construction it is.
So it seems worth considering a prior (very hand-wavy, and weak in terms of actual evidence) that in worlds where I expect things like myself to be setting up experiments like this, I’m slightly more likely to be an instance of the one I would prefer to be.
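To make that prior slightly more concrete, here’s a toy Bayes sketch (all the numbers are made up, purely to show the direction of the update). Let $B$ be “I am the big brain” and take a counting prior $P(B) = 1/4$. If constructors like me tend to align each brain’s preference with what it is, then noticing that I prefer being the big brain is weak evidence that I am it:

$$P(B \mid \text{prefer } B) = \frac{P(\text{prefer } B \mid B)\, P(B)}{P(\text{prefer } B \mid B)\, P(B) + P(\text{prefer } B \mid \neg B)\, P(\neg B)} = \frac{0.9 \times 0.25}{0.9 \times 0.25 + 0.3 \times 0.75} = 0.5,$$

assuming (arbitrarily) a 90% chance the constructor gives the big brain that preference versus 30% for the small ones. The update only exists to the extent you actually expect preference-aligning constructors, which is the hand-wavy part.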