It sounds less like he rewrote his natural morality and more like he engaged in a lot of motivated reasoning to justify his selfish behaviour. Rational Utilitarianism is the greatest good for the greatest number given the constraints of imperfect information and faulty brains. The idea that other people don’t have worth because they aren’t as prosocial as you is not Rational Utilitarianism (especially when you aren’t actually prosocial because you don’t value other people).
If whoever it is can’t feel much sympathy for people in distant countries then that is fine; plenty of people are like that. The good thing about consequentialism is that it doesn’t care about why. You could do it for self-esteem, social status, empathy, or whatever, but you still save lives either way. Declaring yourself a Rational Utilitarian and then not contributing is just a dishonest way of making yourself feel superior. To be a Rational Utilitarian you need to be a rationalist first, and that means examining your beliefs even when they are pleasant.
Rational Utilitarianism is the greatest good for the greatest number given the constraints of imperfect information and faulty brains.
No; I object to your claiming the term “rational” for that usage. That’s just plain-old Utilitarianism 1.0 anyway; it doesn’t take a modifier.
Rationality plus Utilitarianism plus evolutionary psychology leads to the idea that a rational person is one who satisfies their own goals. You can’t call trying to achieve the greatest good for the greatest number of people “rational” for an evolved organism.
Rational Utilitarianism is the greatest good for the greatest number given the constraints of imperfect information and faulty brains.
Rationality is the art of making better decisions in service of a goal, taking into account imperfect information and the constraints of our mental hardware. When applied to utilitarianism you get posts like Nobody Is Perfect, Everything Is Commensurable.
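To make that division of labour concrete (my own sketch, not something anyone in the thread wrote): utilitarianism supplies the objective, and rationality is the bounded pursuit of that objective under uncertainty, roughly

\[
a^{*} \;=\; \arg\max_{a \,\in\, A_{\text{feasible}}} \; \mathbb{E}\!\left[\sum_{i} U_i(a) \;\middle|\; \text{the agent's actual evidence}\right]
\]

where restricting attention to a feasible action set and conditioning on limited evidence are what “imperfect information and faulty brains” add on top of textbook Utilitarianism 1.0.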
Rationality plus Utilitarianism plus evolutionary psychology leads to the idea that a rational person is one who satisfies their own goals.
I don’t see how this follows. Evolutionary psychology provides some explanations for our intuitions and instincts that the majority of humans share, but that doesn’t really say anything about morality, since Is Cannot Imply Ought.
Some quotes from the wiki page on evolutionary psychology.
We are optimized for an “ancestral environment” (often referred to as EEA, for “environment of evolutionary adaptedness”) that differs significantly from the environments in which most of us live. In the ancestral environment, calories were the limiting resource, so our tastebuds are built to like sugar and fat.
Evolution’s purposes also differ from our own purposes. We are built to deceive ourselves because self-deceivers were more effective liars in ancestral political disputes; and this fact about our underlying brain design doesn’t change when we try to make a moral commitment to truth and rationality.
I don’t see how this follows. Evolutionary psychology provides some explanations for our intuitions and instincts that the majority of humans share, but that doesn’t really say anything about morality, since Is Cannot Imply Ought.
Start by saying “rationality” means satisficing your goals and values. The issue is what values you have. You certainly have selfish values. A human also has values that lead to optimizing group survival. Behavior oriented primarily towards those goals is called altruistic.
The model of rationality presented on LessWrong usually treats goals and values that are of negative utility to the agent as biases or errors rather than as goals evolved to benefit the group or the genes. That leads to a view of rationality as strictly optimizing selfish goals.
As to old Utilitarianism 1.0, where somebody just declares by fiat that we are all interested in the greatest good for the greatest number of people—that isn’t on the table anymore. People don’t do that. Anyone who brings that up is the one asserting an “ought” with no justification. There is no need to talk about “oughts” yet.
Rationality means achieving your goals and values efficiently and effectively.
The model of rationality presented on LessWrong usually treats goals and values that are of negative utility to the agent as biases or errors rather than as goals evolved to benefit the group or the genes. That leads to a view of rationality as strictly optimizing selfish goals.
This is a false dichotomy. Just because a value is not of negative utility doesn’t mean it is optimized to benefit the genes. Scott Alexander, for example, is asexual, and there are plenty of gay people.
As to old Utilitarianism 1.0, where somebody just declares by fiat that we are all interested in the greatest good for the greatest number of people—that isn’t on the table anymore. People don’t do that.
GiveWell exists, Peter Singer exists, and the Effective Altruism movement exists. They may not be perfect utilitarians, but most rationalists aren’t perfect either; neither are most Christians, and they still exist.
This ended up giving him worse-than-human morality, because he assumes that humans are not actually moral—that humans don’t derive utility from helping others. He ended up convincing himself to do the selfish things that he thinks are “in his own best interests” in order to be a good rationalist, even in cases where he didn’t really want to be selfish.
I finally remembered the Less Wrong meta-ethics sequence, which you should read. This in particular.