Also note that the whole “extend moral judgment” concept is yours; I said nothing about moral judgments, only about possible decisions.
What I meant is simply that decisions are made based on valuation of their consequences. I consistently use “morality” in this sense.
When the very fate of the universe is at stake, I can most certainly make decisions based on inferences from whatever information I have available, including the use of the letters C, E and V.
I agree. What I took issue with in your comment was the perceived certainty of the decision. Under severe uncertainty, your current guess at the correct decision may well be “stop Eliezer”, but I don’t see how, with the present state of knowledge, one can have any certainty in the matter. And you did say that it’s “quite likely” that CEV-derived AGI is undesirable:
The coherent extrapolated volition of all of humanity is quite likely to be highly undesirable. I sincerely hope Eliezer was lying when he said that.
(Why are you angry? Do you need that old murder discussion resolved? Some other reason?)