It’s the first “reasoning”, not the second, that’s causing the third. Reasoning about puppies causes reasoning, not puppies.
Is it possible for a simulator that doesn’t physically incorporate a human brain to reason just as we do?
Yes.