I would be surprised if Eliezer believed (1) or (2), as distinct from believing that CEV[X] is the most viably actionable approximation of morality[X] (using your terminology) we’ve come up with thus far.
This reminds me somewhat of the difference between believing that 2013 cryonics technology reliably preserves the information content of a brain on the one hand, and on the other believing that it merely has a higher chance of preserving that information than burial or cremation does.
I agree that he devotes a lot of time to arguing against (3), though I’ve always understood that as a reaction to the “but a superintelligent system would be smart enough to just figure out how to behave ethically and then do it!” crowd.
I didn’t intend to distinguish that finely.
I’m not really sure what you mean by (4).
(4) is intended to mean that if we alter humans to have a different value system tomorrow, we would also be changing what we mean (today) by “morality”. It’s the negation of the assertion that moral terms are rigid designators, and is what Eliezer is arguing against in “No License To Be Human”.
Ah, gotcha. OK, thanks for clarifying.