The question is what that emotional core is and where it comes from, because that's what your utility function is. Rationalists emotionally value the perception of being correct and use reason to attain that perception. To apply rationality equally to all of your mappings of the territory, you must have no other emotional impulses that conflict with decoding the raw truth rather than the truth you'd prefer to see. If you really like guns and perceiving yourself as correct, and value nothing else, you'll do everything you can to minimize the trouble people give you for owning firearms. If that means never shooting other agents, so be it.
I can guess this comment won't be popular the moment I formally assert that the brain is ultimately just a Bayesian calculating machine, with the utility function (set by deep emotional commitments) being the only fundamental difference between two people. Of course, at the level of actual human interaction, the conscious mind isn't the Bayesian agent. The real Bayesian agent is the emotional core influencing all the higher-order operations of your brain, consciousness and the reasoning it perceives included. This can only really be processed by agents with a high proportion of utilitarianism in their utility function: those among you who actually care what other people think. I might as well be speaking Greek to anyone else.