That is sort of an illogical question. What it boils down to is: “Is your goal to feel like you are helping somebody, or is your goal to actually help somebody you are emotionally attached to?” If the former, then aside from realizing I have some pretty vapid and pointless goals, I’d get in the machine. But if I had the clarity to see my own part in that goal system, helping others probably wouldn’t rank high on my list of things to do anyway; I would get in without a glance backward and start thinking up something interesting. And if I genuinely believed that I cared about the person and got in the box anyway, would I really qualify as a sentient being? The machine might as well be a meat-grinder in that case.
If I genuinely cared about the person in question, I would realize that with me inside the machine, he would still be suffering. My social programming would not easily let me deviate from the “right thing to do”, so I would refuse to get in.