I’m curious how this fits into the context. Regardless of whether one believes it’s true, doesn’t it seem reasonable and intuitively right, and so the opposite of what is asked for?
I think the argument that would have been seen as ridiculous by most in the nuclear weapons example is, “The right arrangement of (essentially, what look like) rocks and metals will not only make a big explosion, but could destroy all life on earth in seconds.” The argument in favor, and the eventual, correct argument against, were both highly technical and inaccessible. Also, the people deepest in the technical weeds were both the ones capable of seeing the danger and the ones needed to figure out whether the danger was real.
So it would be:
Claim: A nuclear bomb could set the atmosphere on fire and destroy everything on earth.
Argument: Someone did a calculation.
Counterargument: Clearly, that’s absurd.
Good Counterargument: Someone else did another calculation.
And I guess the analogy to AI applies to foom/doom at the bottom, where one can actually do calculations to, at least in principle, estimate some OOMs.
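For concreteness, here is the kind of back-of-envelope OOM arithmetic I have in mind, as a toy Python sketch. It uses the common ~6·N·D rule of thumb for training compute (N = parameters, D = training tokens); the specific numbers are made-up round figures, and only the order of magnitude is the point:

```python
import math

def training_flops(params: float, tokens: float) -> float:
    """Rough training compute via the common ~6 * N * D rule of thumb
    (N = parameter count, D = number of training tokens)."""
    return 6 * params * tokens

# Made-up round numbers, purely illustrative of the OOM-style estimate.
estimate = training_flops(params=1e12, tokens=1e13)
print(f"~1e{math.log10(estimate):.0f} FLOPs")  # -> ~1e26 FLOPs
```

The point isn’t the particular numbers, just that in the AI case, as in the atmospheric-ignition case, there is at least some quantity one can sit down and compute rather than argue about purely from intuition.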