Well, even if we reliably knew that certain optimizations make copies not conscious, some people might still want to run optimized, non-conscious versions of themselves. People are already making LLMs of themselves based on their writings and other output. I don’t think Age of Em discusses this specific case, but collectives of variously modified Ems may perform better (if only by being cheaper) if they are not conscious. Humans who are not concentrating are not general intelligences, and often not conscious either. I’m not conscious when I’m deeply immersed in some subject and only hours later realize how much time has passed, and how much I got done. It’s a kind of automation. Why not run it intentionally?