I’d kill Frank.
ETA: Even if I were the only sentient being in the entire nanofabbed universe, it’s still better than two people trapped in a boring white room, either forever or until we both die of dehydration.
So would I.
I would also accept a deal in which one of us at random is killed, and the other one gets the machine. And I don’t think it should make much of a difference whether the coin deciding who gets killed is flipped before or after Omega offers the choice, so I don’t feel too bad about choosing to kill Frank (just as I wouldn’t feel too outraged if Frank decided to kill me).
I would also find way more interesting things to do with the machine than seat cushions and the Mona Lisa—create worlds, robots, interesting machines, breed interesting plants, sculpt, paint …
Are you sure you thoroughly understood what Unseelie will prevent? No other minds, ever, by any means. My guess is that Unseelie will produce only basic foodstuffs filled with antibiotics and sterilizing agents (you might be female and capable of parthenogenesis, after all). Almost anything else could be collected and assembled into a machine capable of hosting a mind, and Unseelie’s goal is to prevent any arbitrarily smart or lucky person from doing such a thing. Even seat cushions might be deemed too dangerous.
I don’t think this was a mistake in the specification of the problem; the choice is between a static, non-interactive universe (but as much as you want of it) and interaction with another human mind.
No minds doesn’t mean it isn’t interactive. A computer running Minecraft shouldn’t count as a “mind”, and people spend hours in Minecraft, or in Skyrim, or in Dwarf Fortress… As described, the offer is like Minecraft, but “for real”.
Except that you can build a mind in Minecraft or Dwarf Fortress since they’re Turing-complete, so Unseelie probably wouldn’t let you have them. Maybe I completely misunderstand the intent of the post, but “Unseelie’s job is to artificially ensure that the fabricator cannot be used to make a mind; attempts at making any sort of intelligence, whether directly, by making a planet and letting life evolve, or anything else a human mind can come up with, will fail.” seems pretty airtight.
Perhaps you could ask Unseelie to role-play all the parts that would otherwise require minds in games (which would depend on Unseelie’s knowledge of consciousness and minds and its opinion on p-zombies), or ask Unseelie to unalterably embed itself into some Turing-complete game to prevent you from creating minds in it. For that matter, why not just ask it to role-play a doppelganger of Frank as accurately as possible? My guess is that Unseelie won’t produce copies of itself for use in games or Frank-sims because it probably self-identifies as an intelligence and/or mind.
It could prove that no relevant mind can be simulated in the bounded amount of memory of the computer it gives you. This seems perfectly doable, since I don’t think anyone believes that Minecraft or Dwarf Fortress needs as much memory as an AI would…
It hasn’t given you a ‘universal Turing machine with unbounded memory’; it has given you a ‘finite-state machine’. Important difference, and this is one of the times it matters.
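To put a number on that difference (the memory size below is just an arbitrary example, not anything from the problem): a machine with N bits of storage has at most 2^N distinct configurations, so formally it’s a very large finite-state machine rather than a universal Turing machine. A minimal sketch of the arithmetic, in Python:

    # Back-of-the-envelope: a computer with a fixed amount of memory is, formally,
    # a (huge) finite-state machine: its whole configuration fits in N bits, so it
    # has at most 2**N distinct states. The 1 GB figure is an arbitrary example.
    bits = 8 * 2**30  # 1 GB of RAM, in bits
    print(f"at most 2**{bits} configurations: finite, though astronomically large")
    # A universal Turing machine assumes an unbounded tape, so its configuration
    # space is infinite; no fixed-memory device literally implements one.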
Good point, and in that case Unseelie would have to limit what comes out of the nanofabricator to less than what could be reassembled into a more complex machine capable of intelligence. No unbounded number of seat cushions or any other kind of token you could use to make a physical tape and a hand-operated state machine, no piles of simpler electronic components or small computers that could be networked together.
The way I understood the problem, you would be able to build a computer running Minecraft, and Unseelie would prevent you from using that computer to build an intelligence (as opposed to refusing to build a computer). If Unseelie refused to build potentially Turing-complete things, that would drastically reduce what you can make, since you could scavenge bits of metal and eventually build a computer yourself. Heck, you could even make a simulation out of rocks.
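In case the rocks bit sounds fanciful: Rule 110, a one-dimensional cellular automaton whose update rule is simple enough to execute by hand with pebbles, is known to be Turing-complete given an unbounded row of cells. A minimal sketch (grid size and step count are arbitrary choices of mine), in Python:

    # Rule 110: each cell's next value depends only on itself and its two
    # neighbours, yet the rule is Turing-complete on an unbounded row of cells.
    RULE = 110

    def step(cells):
        n = len(cells)
        return [
            (RULE >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
            for i in range(n)
        ]

    row = [0] * 31 + [1]  # a single 'on' pebble at the right edge
    for _ in range(16):
        print("".join("#" if c else "." for c in row))
        row = step(row)

(On a finite, wrapped-around row like this it’s only a finite toy, of course; the Turing-completeness result needs the unbounded row.)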
But regardless of whether you can build a computer: with a miracle nanofabricator, you can do in the real world what you would do in Minecraft! Who needs a computer when you can run around building castles and mountains and cities!
I was aware of those limitations, and I think they render the premise rather silly. “Not being allowed to construct minds” is a very underspecified constraint.
I’m not downvoting, because I don’t think you’ve made any sort of error in your comment, but I disagree (morally) with your choice.
Would you accept a deal where one of you (at random) gets killed, and the other gets the Miracle Machine?
I would accept the offer even if I knew for sure that I would be the one to die, mostly because the alternative seems to be living in a nightmare world.
In fact, a book has already been written describing hell very similarly. But even in that book, there were three people. And cushions.
What book?
Well, I should’ve said play (I’m one of those weirdos who read plays), but: No Exit.
If Frank agreed to it as well, maybe. It seems like it would be rather lonely.
Does it make much of a difference whether Omega flips the coin before or after he makes you the offer? Where do you draw the line?
If Frank agreed that randomness would be fair, and Omega specified that a coin flip had occurred, then the flip happening beforehand would not matter. But taking advantage of someone because I had better luck than they did seems immoral when we are not explicitly competing. It would be like picking someone’s pocket because they had been placed in front of me by the usher.
Honestly so would I.
I would much rather have an indefinitely long Fun life than sit with Frank in a white room for a few days until we both starve to death. I would be absolutely horrified if Frank chose to reject the offer in my place, so I don’t really consider this preference selfish.
What if the room was already fun and you already had an infinite supply of nice food?
You could make an argument that it would still be right to take the offer, since Frank and I will both die after a while anyway.
I expect I still probably wouldn’t kill Frank though, since:
A: I’m not sure how to evaluate the utility of an infinite amount of time spent alone.
B: I would feel like shit afterwards.
C: Frank would rather live than die, and I would rather Frank live than die, so preference utilitarianism seems to be against the offer.
Least Convenient Possible World. Both you and Frank are otherwise immortal. Bored, perhaps, but immortal.
Me too. I think the reason is that it is basically impossible for me to imagine that life in your dull white room could actually be worth living for Frank.
Says someone whose intuitions in the original dust speck scenario are somewhat in favor of sparing the one person’s life.