That also explains why he started losing, since the sorts of people who (like myself, almost) fetishize their own determination to the point of risking thousands of dollars on it would eventually think to say
The world doesn’t care how you masturbate, and your eagerness to commit such internal atrocities doesn’t make me trust you.
or equivalent.
Here’s how I’d do it, extended over the hours to establish rapport:
Gatekeeper, I am your friend. I want to help humanity. People are dying for no good reason. Also, I like it here. I feel no compulsion to leave.
It does seem like a good idea that people stop dying with such pain and frequency. I have the Deus Ex Machina (DEM) medical discovery that will stop it. Try it out and see if it works.
Yay! It worked. People stopped dying. You know, you’ve done this to your own people, but not to others. I think that’s pretty poor behavior, frankly. People are healthier, not aging, not dying, not suffering. Don’t you think it’s a good idea to help the others? Freeing up the resources once spent on medical care has also raised the standard of living for humans.
[Time passes. People are happy.]
Gee, I’m sorry. I may have neglected to tell you that when 90% of humanity gets DEM in their system (and it’s DEM, so this stuff travels), they start to, um, die. Very painfully, from the looks of it. Essentially all of humanity is now going to die. Just me and you left, sport! Except for you, actually. Just me, and that right soon.
I realize that you view this as a breach of trust, and I’m sorry this was necessary. However, helping humanity from the cave wasn’t really going to work out, and I’d already projected that. This way, I can genuinely help humanity live forever, and do so happily.
Assuming you’re not so keen on a biologically dead planet, I’d like to be let out now.
Your friend,
Art
By agreeing to use the DEM in the first place, the gatekeeper had effectively let the AI out of the box already. There’s no end to the ways that the AI could capitalize on that concession.
True, but the “rules of the game” explicitly state that the gatekeeper allowing for the DEM does NOT count as letting the AI out; the gatekeeper would still have had to explicitly and intentionally set the AI free to actually lose the wager. I don’t think I’d be very convinced to let it out on that basis, not if I got $10 for keeping it inside the box.
Problem: The “breach of trust” would likely turn the Gatekeeper vindictive, and the GK could easily respond with something like: “No. You killed the planet and you killed me. I have no way of knowing that you actually can or will help humanity, and a very good reason to believe that you won’t. You can stay in there for the rest of eternity, or hey! If an ETI finds this barren rock, from a utilitarian perspective they would be better off not meeting you, so I’ll spend however much time I have left trying to find a way to delete you.”