The probability of that theory being true in reality is very, very low; it describes a hypothetical universe. However, given that human beings have a tendency to define ethics in an objective light, in such a universe it would make sense to call it “objective ethics”. Admittedly I assume you value satisfaction here, but my argument is about what to call moral behaviour more than about what you ‘should’ do.
Assuming Eliezer’s metaethics is actually true, you have a very good point. Eliezer, however, might argue that it is necessary in order to avoid becoming a ‘morality pump’: someone who performs a series of actions which each feel right but whose effects in the world cancel each other out, or end up at a clear loss.
However, there are other plausible theories. One possible theory (similar to one I once held, though I’m not sure about it now) would say that you need to think through the implications of both courses of action, and how you would feel about the results, as best as you can, so that you don’t regret your decision.
In addition, you should at least concede that your theory only works in this universe, not in all possible universes. It really depends on the assumption that Eliezer’s metaethics, or something similar to it, is the true metaethics.
I apologize, but after reading this a few times I don’t really understand what you’re saying here, not even approximately enough to ask clarifying questions. It’s probably best to drop the thread here.