I don’t think that’s a fair assessment of what they said. They cite their years as evidence that they witnessed multiple doomsday predictions that turned out wrong. That’s a fine point.
I witnessed them as well, and they don’t move my needle back on the dangers of AI. Referring to them is pure outside view, when what is needed here is inside view, because when no-one does that, no-one does the actual work.
Actually I fully agree with that. I just have the impression that your choice of words suggested that Dave was being lazy or not fully honest, and I would disagree with that. I think he’s probably honestly laying out his best arguments for what he truly believes.
I certainly wasn’t intending any implication of dishonesty. As for laziness, well, we all have our own priorities. Despite taking the AGI threat more seriously than Dave Lindbergh, I am not actually doing any more about it than he is (presumably nothing), as I find myself at a loss for any practical ideas for addressing it.
FWIW, I didn’t say anything about how seriously I take the AGI threat—I just said we’re not doomed. Meaning we don’t all die in 100% of future worlds.
I didn’t exclude, say, 99%.
I do think AGI is seriously fucking dangerous and we need to be very very careful, and that the probability of it killing us all is high enough to be really worried about.
What I did try to say is that if someone wants to be convinced we’re doomed (== 100%), then they want to put themselves in a situation where they believe nothing anyone does can improve our chances. And that leads to apathy and worse chances.
So, a dereliction of duty.