I understand EY thinks that if you simulate enough neurons sufficiently well you get something that’s conscious.
Without specifying the arrangement of those neurons? Of course it should if you copy the arrangement of neurons out of a real person, say, but that doesn't sound like what you meant.