Upon closer reading, I notice that you are trying to draw a clear distinction between the implementation of CEV and the kind of world CEV produces. I had been thinking that the implementation would have a big influence on the kind of world.
But you may be assuming that the world created by the FAI, under the guidance of the volition of mankind, really depends on that volition and not on the programming fine print that implements “coherent” and “extrapolated”. Well, if you think that, and the price tags only buy you the opportunity to speculate on what mankind will actually want, then yes, that is another possible interpretation.
Yeah. When I read that pricing schedule, what I see is Eliezer preempting:
enthusiastic singularitarians whiling away their hours dreaming about how everyone is going to have a rocketpack after the Singularity;
criticism of the form “CEV will do X, which is clearly bad; therefore CEV is a bad idea” (where X might be “restrict human autonomy”). This kind of criticism comes from people who don’t understand that CEV is an attempt to avoid doing any X that is not clearly good.
The CEV document continues to welcome other kinds of criticism, such as the objection that the coherent extrapolated volition of the entire species would be unacceptably worse for an individual than that of 1000 like-minded individuals (Roko says something like this) or a single individual (wedrifid says something like this) -- the psychological unity of mankind notwithstanding.