That the simulated copy of you would somehow not perceive itself as you? That the process just can’t work and can’t create anything recognizably conscious, intelligent, or human?
Don’t worry—the comments by Mitchell_Porter in this comment thread were actually written by a vortexless simulation of an entirely separate envortexed individual who also comments under that account. So here, all of the apparent semantic content of “Mitchell_Porter”’s comments is illusory. The comments are actually meaningless syntactically-generated junk—just the emissions of a very complex ELIZA chatbot.