Worrying that you might experience such pain/sorrow/disutility, but not worrying that you might miss out on orders of magnitude more pleasure/satisfaction/utility than humans currently expect is one asymmetry to explore. The other is worrying that you might experience it, more than worrying that trillions (or 3^^^3) ems might experience it.
Having a reasoned explanation for why your intuitions are so lopsided regarding risk and reward, and regarding self and aggregate, will very much help you calculate the best actions to navigate between the extremes.
It’s more a selfish worry, tbh. I don’t buy that pleasure being unlimited can cancel it out though—even if I were promised a 99.9% chance of Heaven and 0.1% chance of Hell, I still wouldn’t want both pleasure and pain to be potentially boundless.