Do we value saving lives independently of the good feelings we get from it?
I recently thought about something similar. I am really keen to learn and understand how the universe works. But pursuing that might be a mistake compared to faking those feelings by immersing myself in a virtual reality where I feel as though I am constantly discovering new and deep truths about reality. That way I could possibly experience a billion times as much pleasure, excitement, and every other reaction my body is capable of, compared to discovering the actual truth once and for real. Nevertheless, I would always choose the real thing over the simulation. Is that irrational?
Everything we want and do, we do because it satisfies our body (brain, i.e. us), because it makes us feel good in various ways, and bad if we don’t do it. Can I then objectively justify assigning more weight (utility) to a goal that yields far less of that bodily payoff?
I don’t know exactly how I am confused. I suppose I am asking for some objective grounding of the notion of utility, and for a reason why maximizing it, regardless of how it is caused, wouldn’t be the rational choice. Otherwise, as David Hume once wrote, “’Tis not contrary to reason to prefer the destruction of the whole world to the scratching of my finger.”