@Eliezer: I think your definition of “optimization process” is a very good one; I just don’t think that the technological singularity [will necessarily be]/[ought to be] an instance of it.
Eliezer: “… you’re just finding new paths to the same goals, leading through portions of reality that you may not have suspected existed.”
This may be a point on which we disagree quite fundamentally. Has it occurred to you that one might introduce terminal values in a new, richer ontology that were not even possible to state in the old one? Surely you’re aware that most of the things an adult human considers to be of terminal value are not statable in the ontology of a 3-year-old (“Loyalty to Libertarianism”, “Mathematical Elegance”, “Fine Literature”, “Sexuality”, …), and that most things a human considers to be of terminal value are not statable in the ontology of a chimp.
I think it is the possibility of finding new value-states that were simply unimaginable to an earlier version of oneself that attracts me to the transhumanist cause; if you cast the singularity as an optimization process, you rule out this possibility from the start. An “optimization process”-based version of the singularity will land us in something like Iain M. Banks’s Culture, where human drives and desires are supersaturated by advanced technology, but nothing really new is done.
Furthermore, my desire to experience value-states that are simply unimaginable to the current version of me is not statable as an optimization problem: optimization always takes place over a known set of states (as you explained well in the OP).
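To make the point concrete, here is a minimal sketch of the standard formalization I have in mind (the symbols S and f are mine, not Eliezer’s): an optimization process presupposes a state space and a preference ordering over it, both fixed in advance,

$$x^{*} \;=\; \arg\max_{x \in S} f(x), \qquad S \text{ a fixed set of candidate states}, \quad f : S \to \mathbb{R}.$$

A value-state that cannot even be described in the ontology that defines S has no representative x in S, so no choice of f can make the process seek it out; that is the sense in which the desire above is not statable as an optimization problem.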