Right, that isn’t an exhaustive list. I included the candidates which seemed most likely.
So, I think superintelligence is unlikely in general, but so is current civilization. I think superintelligences have a high occurrence rate given current civilization (for lots of reasons), which also means that current civilization isn’t that much more likely than superintelligence. It would be more justified to say that “superintelligences which make human minds” have a very low occurrence rate relative to natural instances of me and my environment, but even that still seems like an unlikely explanation.
Based on the “standard” discussion on this topic, I get the distinct impression that the probability our civilization will construct an aligned superintelligence is significantly greater than, say, 10^-20%, and the enormous leverage a superintelligence would have (there’s a lot of matter out there) would produce the same effect.