It doesn’t matter how many fake versions of you hold the wrong conclusion about their own ontological status, since those fake beliefs exist in fake versions of you. The moral harm caused by a single real Chantiel thinking they’re not real is infinitely greater than infinitely many non-real Chantiels thinking they are real.
Interesting. When you say “fake” versions of myself, do you mean simulations? If so, I’m having a hard time seeing how that could be true. Specifically, what’s wrong with me thinking I might not be “real”? I mean, if I thought I was in a simulation, I think I’d do pretty much the same things I would do if I thought I wasn’t in a simulation. So I’m not sure what the moral harm is.
Do you have any links to previous discussions about this?