I’m starting to sympathize with PlaidX’s complaint. If what you really want to ask is, “Could a flipbook be conscious?” then why not just say that? The torture is completely irrelevant.
Asking whether the simulation is morally relevant frames the question as a decision problem, rather than as classification by a poorly specified concept.
But it is reasonable to expect that most decisions will be based solely on the poorly specified classification.
I’m reminded of a story in Orion’s Arm where a superintelligence is simulated with pencil and paper. The depiction isn’t a flipbook, of course. In the story, a group of volunteer baseline humans carried out the algorithm of a superintelligence, doing the arithmetic by hand on pieces of paper. They did it as a hobby.
After searching for a while, I found the story.