Does equanimity prevent negative utility?
Even though I’ve only just begun dabbling in meditation, I believe it is an incredibly powerful tool. I do have one worry, though, relating to what exactly counts as suffering. If I scald myself, my pain nerves fire — might this produce negative utility even if my conscious mind maintains complete equanimity?
And while equanimity might reduce negative utility in most cases, someone with this ability might voluntarily put themselves in situations involving far more suffering than they would otherwise ever encounter, with a net negative result for them. A large part of what worries me is that I am massively uncertain about this issue, and I honestly have no idea how we could settle it conclusively.
Some factors that might make us more likely to believe this is the case:
If we believe that insects feel pain that is axiologically relevant, this would demonstrate that higher-level processing isn’t a necessary component of morally relevant suffering
Multiagent models of the brain make it more plausible that one subagent might experience pain while another doesn’t
Panpsychism could be taken to suggest that individual components of the brain can suffer independently