I don’t see any glaring flaws in any of the items on the inside view, though, obviously, I would not be qualified to evaluate them anyway. However, when I try to take an outside view on this, something doesn’t add up.
Specifically, it looks like almost anything resembling a civilization should eventually evolve, naturally or artificially, into an unsafe AGI: some version of Hanson’s grabby aliens. We don’t see anything like that, at least not in any detectable way. And so we hit the Fermi paradox, where an unremarkable backwater system is apparently the first one about to make that transition, many billions of years after the Big Bang. That is not outright impossible, but the odds do not match anything presented by Eliezer. Hanson’s explanation for why we don’t see grabby aliens is a “conversion rate” of less than 1 in 10,000 for the “non-grabby to grabby transition”:
assuming a generous million year average duration for non-grabby civilizations, depressingly low transition chances p are needed to estimate that even one other one was ever active anywhere along our past lightcone (p ≲ 10⁻³), has ever existed in our galaxy (p ≲ 10⁻⁴), or is active now in our galaxy (p ≲ 10⁻⁷). Such low chances p would bode badly for humanity’s future
However, an unaligned AGI that ends humanity ought to have a much higher chance of transitioning into grabbiness than that, so there is a contradiction between the prediction of an unsafe AGI takeover and the lack of evidence of any such takeover in our past lightcone.
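To make the tension concrete, here is a toy expected-value sketch. The numbers are hypothetical illustrations, not Hanson’s actual model (which involves a full spatiotemporal analysis); the point is only that a quiet sky forces the product of civilization count and transition probability to be small.

```python
# Toy expected-value sketch (hypothetical numbers, not Hanson's full model):
# if N non-grabby civilizations ever arose in our past lightcone, each with
# independent probability p of turning grabby, we expect to see about N * p
# grabby civilizations. A quiet sky therefore forces N * p to stay below ~1.

def expected_grabby(n_civilizations: float, p_transition: float) -> float:
    """Expected number of grabby civilizations, assuming independent transitions."""
    return n_civilizations * p_transition

N = 1_000  # assumed count of non-grabby civilizations in our past lightcone

print(expected_grabby(N, 1e-3))  # 1.0   -- borderline, in line with p <~ 10^-3
print(expected_grabby(N, 0.9))   # 900.0 -- hard to square with an empty sky
```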
A low conversion rate to grabbiness is only needed in the model if you think there are non-grabby aliens nearby. A high conversion rate is possible if the great filter is in our past and industrial civilizations are incredibly rare.
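The same toy arithmetic illustrates this reply (again with made-up numbers): with very few civilizations, even a near-certain transition predicts a quiet sky.

```python
# Same toy arithmetic as above: if the great filter is behind us and
# industrial civilizations are incredibly rare, a high transition
# probability p is still consistent with seeing no grabby aliens.

def expected_grabby(n_civilizations: float, p_transition: float) -> float:
    return n_civilizations * p_transition

# If we are roughly the only industrial civilization in our past lightcone,
# even near-certain conversion predicts about one grabby civilization: us.
print(expected_grabby(1, 0.9))  # 0.9
```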