A more recent study found that a slight majority of people would prefer to remain in the simulation.
I believe lukeprog was talking about what people think before they get wireheaded. It’s very probable that once one gets hooked up to that machine, one changes one’s mind, based on the new experience.
It’s certainly true for the rats that could not stop hitting the ‘pleasure’ button and died of starvation.
This is also why people have that status quo bias: no one wants to die of starvation, even with a ‘pleasure’ button.
Isn’t there a rule of Bayesianism that you shouldn’t be able to anticipate changing your mind in a predictable manner, but rather you should just update right now?
Perhaps rather than asking whether you would enter or leave the simulation, it might be better to start with a person inside it, remove them from it, and then ask whether they want to go back.
Isn’t there a rule of Bayesianism that you shouldn’t be able to anticipate changing your mind in a predictable manner, but rather you should just update right now?
Changing your mind based on evidence and changing it based on new experiences are different things. I am confident that if I eat a meal, my hunger will decrease. Does that mean I should update my hunger downward now, without eating?
I can believe “If I wireheaded I would want to continue wireheading” and “I currently don’t want to wirehead” without contradiction and without much pressure to want to wirehead.
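The rule alluded to above is presumably conservation of expected evidence: your current credence already equals the expectation of your future credence, so you cannot predictably expect it to shift in one particular direction. A minimal sketch, assuming a hypothesis H and a discrete evidence variable E (both introduced here only for illustration):

\[
\mathbb{E}\big[P(H \mid E)\big] \;=\; \sum_{e} P(E = e)\, P(H \mid E = e) \;=\; \sum_{e} P(H, E = e) \;=\; P(H).
\]

This constrains credences, not preferences or bodily states, which is why the hunger example does not violate it.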
Changing your mind based on evidence and changing it based on new experiences are different things. I am confident that if I eat a meal, my hunger will decrease. Does that mean I should update my hunger downward now, without eating?
One’s hunger isn’t really an idea of the mind that one can change, yeah? I’d say that “changing your mind” (at least regarding particular ideas and beliefs) is different from “changing a body’s immediate reaction to a physical state” (like lacking nourishment: hunger).
If you conducted brain surgery on me I might want different things. I should not want those things now—indeed, I could not, since there are multiple possible surgeries.
“Wireheading” explicitly refers to a type of brain surgery involving sticking wires in one’s head. Some versions of it may not be surgical, but the point stands.
This is also why people have that status quo bias: no one wants to die of starvation, even with a ‘pleasure’ button.
It was my understanding that the hypothetical scenario ruled this out (hence the abnormally long lifespan).
In any event, an FAI would want to maximize its utility, so if its utility were contingent on the amount of pleasure being experienced, it seems probable that it would want to create as many humans as possible and keep them alive as long as possible in a wirehead simulation.
I think we’re talking about an experience machine, not a pleasure button.