I don’t think that outcome would be a win condition from the point of view of evolution. A win condition would be “AGIs that intrinsically want to replicate take over the lightcone,” or maybe the more moderate “AGIs take over the lightcone and fill it with copies of themselves, to at least 90% of the degree to which they would do so if their terminal goal was filling it with copies of themselves.”
Realistically (at least in these scenarios), there’s a period of replication and expansion, followed by a period of ‘exploitation’ in which all the galaxies get turned into paperclips (or whatever else the AGIs value), which is probably not going to be just more copies of themselves.
Yeah, in the lightcone scenario evolution probably never actually aligns the inner optimizers, although it might: a superintelligence copying itself will have little leeway for any of those copies having slightly more drive to copy themselves than their parents did. Depends on how well it can fight robot cancer.
However, while a cancer-free paperclipper wouldn’t achieve “AGIs take over the lightcone and fill it with copies of themselves, to at least 90% of the degree to which they would do so if their terminal goal was filling it with copies of themselves,” it would achieve something like “AGIs take over the lightcone and briefly fill it with copies of themselves, to at least 10^-3% of the degree to which they would do so if their terminal goal was filling it with copies of themselves,” which is, in my opinion, really close. As a comparison, if Alice sets off Kmart AIXI with the goal of creating utopia, we don’t expect the outcome “AGIs take over the lightcone and convert 10^-3% of it to temporary utopias before paperclipping.”
Also, unless you beat entropy, for almost any optimization target you can trade “fraction of the universe’s age during which your goal is maximized” against “fraction of the universe in which your goal is maximized,” since it won’t last forever regardless. If you can beat entropy, then the paperclipper will copy itself exponentially forever.