I am not entirely sure I disagree with you. However, I am having difficulty modeling you.
“Achieving a goal” seems to mean, for our purposes, something along the lines of “Bringing about a world-state.” Most possible world-states do not involve human existence. Thus, it seems that for most possible goals, achieving a goal entails human extinction.
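(A toy illustration of the counting intuition, in Python. The "world-state as a random bit-string" encoding and the "humans exist" prefix predicate are my own hypothetical stand-ins, not a claim about how actual world-states are individuated; the point is only that a narrowly specified feature occupies a vanishing fraction of uniformly sampled states.)

```python
# Toy sketch (illustrative only): treat a "world-state" as a random 32-bit
# string and "contains humans" as the narrow predicate that a specific
# 16-bit prefix is present. Under uniform sampling, only about 1 in 2**16
# states satisfy the predicate.
import random

STATE_BITS = 32          # size of a toy "world-state"
PREFIX_BITS = 16         # how narrowly the feature is specified
HUMAN_PREFIX = 0xBEEF    # hypothetical stand-in for "humans exist"

def contains_humans(state: int) -> bool:
    """True iff the toy state's leading 16 bits match the fixed prefix."""
    return (state >> (STATE_BITS - PREFIX_BITS)) == HUMAN_PREFIX

samples = 1_000_000
hits = sum(contains_humans(random.getrandbits(STATE_BITS)) for _ in range(samples))
print(f"fraction of sampled states with 'humans': {hits / samples:.6f}")
# Expected fraction is 2**-16, roughly 0.000015.
```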
However, your mention of computer programs being produced by intelligent agents is interesting. Are you implying that most AGIs (assuming these intelligences can go FOOM) would not result in human extinction?
If this is not what you were implying, I apologize for modeling you poorly. If it is, I would like to make clear that this post is not intended to be hostile.
Are you implying that most AGIs (assuming these intelligences can go FOOM) would not result in human extinction?
Questions about fractions of infinite sets don't make much sense unless an enumeration strategy is specified. Assuming lexicographic ordering of their source code, and considering only the set of superintelligent programs: no, I don't mean to imply that.
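(A minimal sketch of why the enumeration matters, in Python. "Even numbers among the naturals" is just a stand-in for "programs with some property among all programs": the same infinite set yields different limiting fractions under different enumerations.)

```python
# Illustrative only: the "fraction" of even numbers among the naturals
# depends entirely on the order in which you enumerate them.
from itertools import count, islice

def natural_order():
    """0, 1, 2, 3, ... -- evens appear with limiting density 1/2."""
    yield from count()

def two_odds_per_even():
    """1, 3, 0, 5, 7, 2, ... -- same set, but evens have limiting density 1/3."""
    odds, evens = count(1, 2), count(0, 2)
    while True:
        yield next(odds)
        yield next(odds)
        yield next(evens)

def even_density(enumeration, n=300_000):
    """Fraction of even numbers among the first n enumerated elements."""
    prefix = list(islice(enumeration, n))
    return sum(x % 2 == 0 for x in prefix) / n

print(even_density(natural_order()))      # approximately 0.5
print(even_density(two_odds_per_even()))  # approximately 0.333
```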