So, are you suggesting that Robin Hanson (who is on record as not buying the Scary Idea), the current owner of the Overcoming Bias blog and Eli’s former collaborator on this blog, fails to buy the Scary Idea “due to cognitive biases that are hard to overcome”? I find that a bit ironic.
Like Robin and Eli, and perhaps yourself, I’ve read the heuristics and biases literature too. I’m not so naive as to make judgments about huge issues that I’ve thought about for years of my life based strongly on well-known cognitive biases.
It seems more plausible to me to assert that many folks who believe the Scary Idea are having their judgment warped by plain old EMOTIONAL bias: stuff like “fear of the unknown”, “the satisfying feeling of being part of a self-congratulatory in-crowd that thinks it understands the world better than everyone else”, the well-known “addictive chemical high of righteous indignation”, etc.
Regarding your final paragraph: is your take on the debate between Robin and Eli about “Foom” that everything Robin was saying boils down to “la la la I can’t hear you”? If so, I would suggest that maybe YOU are the one with the (metaphorical) hearing problem ;p ...
I think there’s a strong argument that the truth value of “Once an AGI is at the level of a smart human computer scientist, hard takeoff is likely” is significantly above zero. No assertion stronger than that seems to me to be convincingly supported by any of the arguments made on Less Wrong or Overcoming Bias, or by any of Eli’s prior writings.
Personally, I actually do strongly suspect that once an AGI reaches that level, a hard takeoff is extremely likely unless the AGI has been specifically inculcated with goal content working against this. But I don’t claim to have a really compelling argument for this. I think we need a way better theory of AGI before we can frame such arguments compellingly. And I think that theory is going to emerge after we’ve experimented with some AGI systems that are fairly advanced, yet well below the “smart computer scientist” level.