For if all goes well, the question “What is fun?” shall determine the shape and pattern of a billion galaxies.
I object to most of the things Eliezer wants for the far future, but of all the sentences he has written lately, that is probably the one I object to most unequivocally. A billion galaxies devoted to fun does not leave Earth-originating intelligence a lot to devote to things that might actually be important.
That is my dyspeptic two cents.
Not wanting to be in a rotten mood keeps me from closely reading this series on fun and the earlier series on sentience or personhood, but I have detected no indication of how Eliezer would resolve a conflict between the terminal values he is describing. If, for example, he learned that the will of the people, oops, I mean, the collective volition, oops, I mean, the coherent extrapolated volition does not want fun, would he reject the coherent extrapolated volition, or would he resign himself to a future of severely submaximal quantities of fun?