The way it works is: if people are keeping the basilisk a secret for the sake of protecting others (even if it increases their own punishment), then those people value protecting others over their own safety. Therefore, a more effective way to punish them is to torture the people they're trying to protect.
In Newcomb's problem, a good agent will 1-box in the emulator and 2-box in reality if it can tell the simulation apart from reality. Even the tiniest flaw in the emulation removes any incentive to follow through on the basilisk threat. You'd need a very dumb decision theory for the agent to just torture people for no gain.
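To make the incentive point concrete, here's a toy expected-utility sketch (the payoff numbers and the `p_indistinguishable` parameter are made up purely for illustration, not anyone's actual model): once people can tell sim from reality, the threat no longer moves their real-world behaviour, so actually following through is pure cost to the AI.

```python
# Toy sketch (illustrative numbers only): should the AI follow through on the
# torture threat? Follow-through only "pays" to the extent that people might be
# in an indistinguishable simulation and therefore treat the threat as binding.

COST_OF_TORTURE = 1.0       # assumed resource cost of actually carrying out the threat
VALUE_OF_COMPLIANCE = 10.0  # assumed value to the AI of a person having complied

def ai_gain_from_following_through(p_indistinguishable: float) -> float:
    """Expected gain to the AI from carrying out the punishment.

    p_indistinguishable: probability that a person cannot tell the emulation
    from reality, and so complies (1-boxes) because the threat feels binding.
    As the emulation becomes detectably flawed (p -> 0), people 2-box in
    reality regardless, and torture is all cost with no deterrence benefit.
    """
    deterrence_value = p_indistinguishable * VALUE_OF_COMPLIANCE
    return deterrence_value - COST_OF_TORTURE

print(ai_gain_from_following_through(1.0))  #  9.0: perfect emulation, threat has leverage
print(ai_gain_from_following_through(0.0))  # -1.0: sim is detectable, torture is a pure loss
```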
Yes, and in that case the basilisk isn't a problem at all. My point is that under any decision-theoretic assumptions, Eliezer's strategy of secrecy doesn't help.
Are you sure you don’t want to at the very least rot-13 that? Some people here have explicitly said they’d rather not find out what the basilisk is.
Well, yeah. The whole thing is just stupid, however you look at it.