But you see Eliezer’s comments because a conscious copy of Eliezer has been run.
A conscious copy of Eliezer that thought about what Eliezer would do when faced with that situation, not a conscious copy of Eliezer actually faced with that situation—the latter Eliezer is still an l-zombie, if we live in a world with l-zombies.
Isn't Eliezer thinking about what he would do when faced with that situation effectively running an extremely simplified simulation of himself? Obviously this simulation is not equivalent to the real Eliezer, but there is clearly something being run here, so it can't be an l-zombie.