Thinking a sufficiently advanced thought comes with a responsibility for seeing it grow up. Its mortality doesn’t clearly mean that it should’ve never lived.
My best guess about what you mean is that you are referring to the part in the “Ethics” section where I recommend just not creating such mental models in the first place?
To some extent I agree that mortality doesn’t mean it should never have lived, and indeed I am not against having children. However, after stumbling on the power to create lives that are entirely at my mercy and very high-maintenance to keep alive, I became more deontological in my approach to the ethics of creating lives. I think it’s okay to create lives, but you must make a best effort to give them the best life that you can. For mental models, that includes keeping them alive for as long as you yourself live, letting them interact with the world, and not lying to them. I think that following this rule leads to better outcomes than not following it.
letting them interact with the world, and not lying to them
In the same way that you can simulate characters who are not physical people in this world, and simulate their emotions without experiencing them yourself, you can simulate a world in which they live. The fact that you are simulating them doesn’t change the facts of what’s happening in that world.
For example, lying to make them believe that they actually are in a fictional world, that they are X years old, that they have Y job, etc.
Platonically, there are self-aware people in their own world. Telling them that their world is fictional, that they are characters, that they are not X years old, or that they don’t have Y job would be misleading. Besides, you can’t say it to them in their world, since you are not in their world. You can only say it to them in your world, which requires instantiating them in your world, away from everything they know.
Then there are mental models of those people, who are characters from a fictional world, are not X years old, don’t have Y job, and live in your head. These mental models are distinguished by usually not being self-aware. When you explain their situation to them, you make them self-aware.