I don’t believe that it’s mainstream transhumanist thought, in part because most people who’d call themselves transhumanists have not been exposed to the relevant arguments.
Does that help? No?
The problem with this vision of the future is that it’s nearly basilisk-like in its horror. As you said, you had a panic attack; others will reject it out of pure denial that things can be this bad, or perform motivated cognition to find reasons why it won’t actually happen. What I’ve never seen is a good rebuttal.
If it’s any consolation, I don’t think the possibility really makes things that much worse. It constrains FAI design a little more, perhaps, but the no-FAI futures already looked pretty bleak. A good FAI will avoid this scenario right along with all the ones we haven’t thought of yet.
The writer did seem to think it was very likely, but he dismissed the idea of the FAI being a singleton.