But when you necessarily lack the computational power to track all the consequences of different strategies, or do not think strategically at all, then believing yourself to be a utilitarian (while not actually being one, due to those computational constraints) leaves you with one of two outcomes: either your behaviour does not change at all, or you philosophize yourself into psychopathy, rationalizing virtually any form of immoral (net-negative global utility) conduct. I do think that believing oneself to be a utilitarian while lacking the hardware to track consequences is functionally equivalent to psychopathy whenever the belief does not work like a "dragon in the garage" belief, i.e. whenever it actually influences one's actions: you can virtually always alter the action a little and set up a partial sum over consequences that comes out positive (if you want to murder a co-worker, you can sell the organs and donate the proceeds to charity, for example). The belief that one is capable of accurately tracking consequences may itself be a product of narcissism, which is a strongly antisocial trait.