I liked the 2nd half more than the first. I think AGI shouldn't be mentioned in it at all: we do well enough at destroying ourselves and our habitat on our own. By the Occam's razor thing, AGI could at most serve as an illustrative example of how exactly we do it… But we do it waaay less elegantly.
For me it's simple: either AGI emerges and takes control from us in ~10 years, or we are all dead in ~10 years.
Consider the probability that a mind which has comprehended and absorbed our cultures and histories and morals and ethics ends up "unaligned," behaving like one of those evil, nasty, stupid characters from the books and movies and plays it grew up reading… Dunno, it should be really small, no? And even if that probability is 0.5, or even 0.9, we still get a 10% chance to survive...
With humans behind the wheel our chance is 0%. They can't organize themselves to reduce greenhouse gas emissions! They can't contain virus outbreaks! If covid had been a bit deadlier, we'd all be dead by now...
I mean, I can imagine some extremely evil and nasty alien civ creating an AGI that starts out just as evil and nasty and kills them all… But such civs exist only in Tolkien's books.
And I can imagine apes trying to solve human alignment in anticipation of humans arriving soon)))) Actually, bingo! Solving AGI alignment could be a good candidate for one of those jobs for the unemployed 50%, to keep 'em busy.