One possibility is “a lot”, in that humans seem to interpret pain on a logarithmic scale, such that 2⁄10 pain is 10x worse than 1⁄10 pain, and so on. However, there is likely some physiological limit to how much sensory data the human brain can process as pain and still register it as pain and suffer from it. This estimate sets aside the possibility of modifying humans in ways that would allow them to experience greater pain.
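A minimal sketch of the logarithmic-scale claim above, assuming (as the comment does, and purely for illustration) that a pain rating of n maps to a felt intensity proportional to 10^n:

```python
# Illustrative model only: assumes a rating of n on a 0-10 pain scale
# corresponds to a felt intensity proportional to 10**n. The base of 10
# is taken from the comment's "2/10 is 10x worse than 1/10" claim, not
# from any established psychophysical result.

def felt_intensity(rating: float, base: float = 10.0) -> float:
    """Map a 0-10 pain rating to a relative felt intensity."""
    return base ** rating

# Under this model, each step up the scale multiplies felt intensity by 10,
# so the gap between 9/10 and 10/10 dwarfs the entire range below it.
ratio = felt_intensity(2) / felt_intensity(1)
print(ratio)  # 10.0
```

On this toy model, a 10/10 pain would be a billion times more intense than a 1/10 pain, which is one way to make the "a lot" answer concrete.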
Note that I also think this question is exactly symmetrical to asking “what’s the maximum level of pleasure”, and so likely the answer to one is the answer to the other.
I think it’s the modifying humans to experience pain part that’s the most terrifying, to be honest.
Interesting intuition. How do you feel about modifying humans (or yourself) to experience more pleasure? If they’re not symmetrical, why not?
I do agree that they’re symmetrical. I just find it worrying that I could potentially experience such enormous amounts of pain, even when the opposite is also a possibility.
Worrying that you might experience such pain/sorrow/disutility, but not worrying that you might miss out on orders of magnitude more pleasure/satisfaction/utility than humans currently expect is one asymmetry to explore. The other is worrying that you might experience it, more than worrying that trillions (or 3^^^3) ems might experience it.
Having a reasoned explanation for why your intuitions are so lopsided, both regarding risk versus reward and regarding self versus aggregate, will very much help you calculate the best actions to navigate between the extremes.
It’s more a selfish worry, to be honest. I don’t buy that unbounded pleasure can cancel it out, though: even if I were promised a 99.9% chance of Heaven and a 0.1% chance of Hell, I still wouldn’t want both pleasure and pain to be potentially boundless.