What do you think DragonGod’s key point is? They haven’t argued against fast takeoff here. (Which is fine.) They seem to have misunderstood me as saying that no one who understands the fast takeoff arguments would disagree that fast takeoff is likely, and then they’ve been defending their right to know those arguments and still disagree that it’s likely.
I think DragonGod’s key point here is that most of the effort should go to the scenarios most likely to happen: while fast takeoff deserves some effort, at this point it’s a mistake to expect Sam Altman to condition heavily on fast takeoff, and not conditioning on it doesn’t make him irrational or ruled by incentives.
It does if he hasn’t engaged with the arguments.