You need to specify what happens if you decline the offer. Right now it looks as if you and Frank both die of dehydration after a couple of days. Or you both go insane and one kills the other (and maybe eats him), and then dies anyway. For this to be a dilemma, the baseline outcome needs to be more… wholesome.
Also, the temptation isn’t very tempting. An ornate chandelier? I could get some value from the novelty of seeing it, and maybe stare at it for several hours if it’s really ornate. Its status as a super-luxury good would be worthless in the absence of a social hierarchy. I couldn’t trade or give away gazillions of them, so multiplying it wouldn’t add anything.
I suppose the nanofab can manufacture novelty (though that isn’t quite clear from your description). But it won’t make minds. This is a problem. Humans are quite big on belonging to a society. I can’t imagine what being the immortal god of a solipsistic matrix would feel like, but I suspect it could be horrible.
The prohibition against creating minds isn’t very clear, since we don’t have a clear idea of what constitutes a mind. Could I ask Omega to generate the best possible RPG, with an entire simulated world and super-realistic NPCs? Would that be allowed? I don’t know that a sufficiently high-fidelity simulation of a person isn’t an actual person. And there would be at least one mind: me. Could I self-modify, grow my sense of empathy to epic proportions, and start imagining people into being? And then, to fix my past sins, I’d order a book titled “Everything You Could Ever Ask About Frank” or something.
I think we should steelman this by stipulating that if you don’t take the trade, neither you nor Frank will die any time soon. You will both live out a normal human lifespan, just a very dull one.
It gets even more interesting if Frank is an immortal in this scenario, and the only way to get the machine is to make him mortal, perhaps with some small probability epsilon. How small does epsilon have to be before you (or Frank) will agree to such a trade?
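To make that question concrete, here is a minimal expected-utility sketch of my own (the labels U_win, U_lose, and U_keep are my additions, not part of the scenario): write U_win for “machine, Frank stays immortal”, U_lose for “machine, Frank ends up mortal”, and U_keep for the status quo with no machine. Then the trade is worth taking iff

$$(1-\epsilon)\,U_{\text{win}} + \epsilon\,U_{\text{lose}} \;\ge\; U_{\text{keep}} \quad\Longleftrightarrow\quad \epsilon \;\le\; \frac{U_{\text{win}} - U_{\text{keep}}}{U_{\text{win}} - U_{\text{lose}}},$$

so the answer turns entirely on how badly the mortal branch craters Frank’s (or your) utility; if losing immortality is treated as infinitely bad, no positive epsilon clears the bar.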
This is basically what I intended with the White Room: make things as simple as possible.
Ironically, this may require a statement that you and Frank will return to the real world after this trade… (except I can’t do that because then the obvious solution is “take the nanofab, go make Hourai Elixirs for everyone, ω^2 utility beats ω.” Argh.)
… Ehhhh… I think I’m going to have to expand Unseelie’s job here. In general, the nanofab is capable of creating anything you want that’s secularly interesting (so, yes, you can have your eternally fun RPG game, though the NPCs aren’t going to pass the Turing test), but no method of resurrecting Frank, or creating another intelligence, can work.
Unseelie has to be more powerful than that; Emile pointed out that I could just simulate a mind with enough rocks (or Sofa Cushions). Unseelie also has to make sure my mind is never powerful enough to simulate another mind. That involves either changing me or preventing me from self-improving, so self-improvement is probably disallowed or severely limited if we keep the prohibition on Unseelie changing me.
I think the easiest way to steelman the loneliness problem presented by the given scenario is to just have a third person, let’s say Jane, who stays around regardless of whether you kill Frank or not.
Maybe create a GLUT (Giant Lookup Table) that always does exactly what Frank would’ve done, but isn’t sentient?
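As a toy illustration of what that would mean (my own sketch; the names and dialogue are made up): a GLUT is just a precomputed mapping from every possible interaction history to whatever the real Frank would have done next, with no mind doing any computing at lookup time.

```python
from typing import Dict, Tuple

# A (very small) stand-in for the Giant Lookup Table: keys are the full
# interaction history so far, values are Frank's precomputed next response.
# A real GLUT would need an entry for every history that could ever occur.
History = Tuple[str, ...]

GLUT_FRANK: Dict[History, str] = {
    (): "Still stuck in this white room, huh?",
    ("Still stuck in this white room, huh?", "Yep. Chess?"): "Sure, I'll take black.",
    # ...astronomically many more entries...
}

def glut_frank(history: History) -> str:
    """Return what Frank would have done next, by pure table lookup.

    Nothing here simulates a mind at runtime; whatever 'thinking' there was
    happened when the table was filled in.
    """
    return GLUT_FRANK.get(history, "(no entry: this history was never supposed to be reachable)")
```

Whether filling in that table is itself possible without running a Frank-simulation is, of course, exactly the question Unseelie would have to answer.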