Here’s my prompt for a Rationalist Failed Utopia #4.2:
Find the nearest lava bath, and threaten the AI that you’ll go hang out there for a while if ze doesn’t do what you want.
You could also just commit to psychological self-torture if ze prevents you from going into the bath.
ETA: Actually, it should be fine, as the AI has “set guards in the air that prohibit lethal violence, and any damage less than lethal, your body shall repair.”
Although that might bring us back to the problem of communicating what you want to an AI.