Although humans are free-willed, some actions contribute to achieving a human’s natural purpose in life and others do not. Said purpose is meant to be coherent, unlike evolutionary purpose, so better achievement of it would lead to satisfaction in the long run.
If I understand you correctly, your claim is that if this turns out to be true, then I ought to perform those acts which contribute to achieving my natural purpose, whether or not I net-value satisfaction. Yes?
When there is a conflict between two desires, both of which feel like they have some claim to moral rightness, correct metaethics is essential for sorting out what is best to do.
Is it? It seems like object-level ethics achieves this purpose perfectly well. If it returns the result that they are equally good to do, then the correct thing to do is pick one. What do I need metaethics for, here?
The probability of that theory being true in reality is very, very low; it describes a hypothetical universe. However, given that human beings have a tendency to define ethics in an objective light, in such a universe it would make sense to call it “objective ethics”. Admittedly I assume you value satisfaction here, but my argument is about what to call moral behaviour more than about what you ‘should’ do.
Assuming Eliezer’s metaethics is actually true, you have a very good point. Eliezer, however, might argue that correct metaethics is necessary to avoid becoming a ‘morality pump’: performing a series of actions which each feel right but whose effects in the world cancel each other out or end up at a clear loss.
However, there are other plausible theories. One possible theory (similar to one I once held but which I’m not sure about now) would say that you need to think through the implications of both courses of action, and how you would feel about the results, as best you can, so that you don’t regret your decision.
In addition, you should at least concede that your theory only works in this universe, not in all possible universes. It really depends upon the assumption that Eliezer’s metaethics, or something similar to it, is the true metaethics.
I apologize, but after reading this a few times I don’t really understand what you’re saying here, not even well enough to ask clarifying questions. It’s probably best to drop the thread here.