Hmm. I was unable to distinguish this person from a real person, and one way to perfectly simulate a human being (given the apparently endless resources these simulators have) would be to create something that's very close to a human being. So I'd be inclined not to repeat the experiment. The benefit if I'm right is $100; if I'm wrong, I cause massive amounts of pain. Of course, saying that means this machine can start mugging me, forcing me to pay money to prevent the pain, which leads me into a question about kidnapping, basically: I should pre-commit to not pressing the button, to encourage AIs not to torture humans, or simulacrum humans I can't distinguish from the real thing.
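The asymmetry in stakes can be made concrete with a toy expected-value calculation. All the numbers below (the probability of being right, the cost assigned to the pain) are illustrative assumptions I'm plugging in, not anything from the scenario itself:

```python
def expected_value(p_not_sentient, gain=100, pain_cost=1_000_000):
    """Toy EV of pressing the button.

    p_not_sentient: my (assumed) credence that the simulation is NOT a
    moral patient. I win `gain` if I'm right, and incur `pain_cost`
    (a stand-in for the suffering caused) if I'm wrong. Both stake
    values are arbitrary placeholders.
    """
    p_wrong = 1 - p_not_sentient
    return p_not_sentient * gain - p_wrong * pain_cost

# Even at 99% confidence the simulation doesn't morally matter,
# pressing comes out strongly negative under these stakes.
print(expected_value(0.99))
```

The point is just that when the downside is large enough relative to $100, pressing stays negative in expectation even at very high confidence, which is why I lean toward not repeating the experiment.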