I think I am happy with how these rules interact with the Anthropic Trilemma problem. But as a simpler test case, consider the following:
An AI walks into a movie theater. “In exchange for 10 utilons’ worth of cash”, says the owner, “I will show you a movie worth 100 utilons. But we have a special offer: for only 1000 utilons’ worth of cash, I will clone you ten thousand times, and every copy of you will see that same movie. At the end of the show, since every copy will have had the same experience, I’ll merge all the copies of you back into one.”
Note that, although AIs can be cloned, cash cannot be. ^_^;
I claim that a “sane” AI is one that declines the special offer.
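To see why, here is a quick back-of-the-envelope comparison (a sketch of my own; the prices and the movie’s value come from the story, but treating the ten thousand identical viewings as a single post-merge experience is the reading I’m defending):

```python
# Back-of-the-envelope comparison of the two offers (numbers from the story;
# variable names are my own). "merged" counts the identical post-merge
# experience once; "naive" counts every copy's viewing separately.

MOVIE_VALUE = 100        # utilons from watching the movie
STANDARD_PRICE = 10      # utilons of cash for the ordinary ticket
SPECIAL_PRICE = 1_000    # utilons of cash for the clone-and-merge offer
NUM_COPIES = 10_000

standard_net = MOVIE_VALUE - STANDARD_PRICE                   # +90
naive_special_net = NUM_COPIES * MOVIE_VALUE - SPECIAL_PRICE  # +999,000
merged_special_net = MOVIE_VALUE - SPECIAL_PRICE              # -900

print(f"Standard ticket:             {standard_net:+}")
print(f"Special offer (per-copy):    {naive_special_net:+}")
print(f"Special offer (merged once): {merged_special_net:+}")
```

Per-copy counting makes the special offer look like a spectacular bargain; counting the merged experience once makes it a plain loss of 900 utilons, so the sane AI buys the ordinary ticket.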