The analogy from historical evolution is the misalignment between human genes and human minds, where the rise of the latter did not result in the extinction of the former. It plausibly could have, but that is not what we observe.
The analogy is that human genes produce a thing (human minds) which wants stuff, but the stuff it wants is different from what the human genes want.
Not nearly different enough to prevent the human genes from getting what they want in excess.
If we apply your frame of the analogy to AGI, we have slightly misaligned AGI which doesn’t cause human extinction, and instead enormously amplifies our utility.
From my perspective you’re strawmanning and failing to track the discourse here to a sufficient degree that I’m bowing out.
From my perspective you persistently ignore, misunderstand, or misrepresent my arguments, overfocus on pedantic details, and refuse to update or agree on basics.