I thought about playing the gatekeeper part and started to imagine tactics that might be used on me. I came up with several that might work, or at least hurt me. But I think it would be ‘easier’ for me not to let the AI out in real life than in the game (not that I am entirely sure I couldn’t fail nonetheless). Both are for basically the same reason: empathy.
As the AI player would quickly find out, I am very caring, and even imagining harm and pain hurts me (I know this is a weak spot, but I also see benefits in it).
Thus one approach that would work on me is that the AI player could induce enough horror that I’d want him to stop, which I could do by letting him out (after all, it’s just a game).
This same approach wouldn’t work with a real AI, precisely because then it is no game: my own horror is outweighed by the horror awaiting all of humanity, for which I’d gladly bear some smaller psychic horror. And in real life there are more ways to get away from the terminal.
There are other attacks that might work, but I will not go into detail here.
Note that I definitely wouldn’t recommend myself as a real gatekeeper.
Interesting. It seems like the main problem is that you don’t really care about winning. So, what if there was some cash on the line (say, an amount equal to roughly 5% of your monthly income)?
For what amount of cash would you risk your mental balance/mental health?
Everybody has to answer this question. It is a real-life question for health care personnel, some doctors, prison guards, and military personnel. Some jobs cost (or risk, or offset, or dull) your empathy (or other emotions).
Happy are those who can avoid these parts of human labor. And praise to the courage (or calling) of those who do them.
I guess it’s hard for me to understand this because I view myself as immune to mental-health harms from horrifying stimuli that I know to be fictional. Even if it’s not fictional, the bulk of my emotions will remain unrecruited unless something I care about is being threatened.
It would take quite a lot of cash for me to risk an actual threat to my mental health... like being chronically pumped with LSD for a week, or getting a concussion, or having a variable beeping noise interrupt me every few seconds. But an AI box game would fall on a boring-stimulating spectrum, not a mental-damage one.
What if another human’s happiness was on the line? After you’ve given a response to that question, qbrf lbhe bcvavba nobhg gur zbarl dhrfgvba punatr vs V cbvag bhg gung lbh pna qbangr vg naq fvtavsvpnagyl rnfr fbzrbar’f fhssrevat? Fnzr zbargnel nzbhag.
I am quite capable of flooding myself with happiness. I do not need LSD for that. And I assume it can be just as addictive. I assume I am equally able to flood myself with sadness and dread. And I fear the consequences. Thus taking LSD or doing the AI box experiment are not that different for me. As I said, that is my weak spot.
I thought the answer to the ‘other person’ question was implied by my post. I’ll bear a lot if other people, especially those I care for, are suffering.
After rot13 I better understand your question. You seem to imply that if I bear the AI experiment, some funding will go to suffering people. Trading suffering in a utilitarian sense. Interesting. No, that does not seem to weigh up.
So yes: I care more about my mental balance than about the small monetary reward.