Is the risk that we might simulate a person? I’d say no.
It’s worse.
We Natural Intelligences don’t just run simulations, we torture them.
It is recommended that authors “Be cruel to your characters”.
It’s not clear to me that the simulation an author runs when thinking about
a story isn’t already “a ‘simulation’ detailed enough to be a person in its own right”.
But it’s probably o.k., because the simulations we run in our heads aren’t
really that detailed, and aren’t really persons in the important sense, right?
So we don’t have to start screaming yet, unless...
It’s worse.
Because even if we aren’t able to create a simulation that good, an AI probably could.
We might not accept an AI as intelligent unless it can simulate a person well enough to fool us.
That is, simulating people might be a necessary, not merely a sufficient, property of AI.
But still, we could, if we had to, avoid simulating people except when necessary, and then only under ethical conditions.
Unless of course...
It’s worse.
Because while we might be ethical, there are certainly people out there who are not.
Once the AI genie is out of the bottle, the unethical people will capture one
and put it to work writing stories. And let’s face it, there are plenty of people who think
“Boy and girl raise family” isn’t as interesting a story as “Boy and girl raise family
from the dead and are dragged to hell.”
Once we have AI authors, some unscrupulous editors are going to want them to torture virtual people.
And if you think people aren’t that cruel and depraved, well...
I think I’m going to stop here. Because while I could go on,
there’s only so much screaming I can deal with.
There is a simple answer to this, but simple doesn’t mean pleasant.
We need only decide “God is always moral”. Above Good and Evil
if you like. You can do whatever you like to your own creations.
This might be a practical answer, but I find it distasteful.
The only reason it doesn’t make me want to scream and run away is
because while you can run, if you’re screaming you can’t hide.
Would a human, trying to solve the same problem, also run the risk of simulating a person?
See also: http://xkcd.com/390/
Alternative webcomic link: http://overcompensating.com/oc/index.php?comic=50