When you say academia looks like a clear win within 5-10 years, is that assuming “academia” means “starting a tenure-track job now?” If instead one is considering whether to begin a PhD program, for example, would you say that the clear win range is more like 10-15 years?
Yes.
Also, how important is being at a top-20 institution? If the tenure track offer was instead from University of Nowhere, would you change your recommendation and say go to industry?
My cut-off was probably somewhere between top-50 and top-100, and I was prepared to go anywhere in the world. If I couldn’t make it into the top 100, I think I would definitely have reconsidered academia. If you’re ready to go anywhere, I think it’s much easier to find somewhere with high EV (though you might have to move up the risk/reward curve a lot).
Would you agree that if the industry project you could work on is the one that will eventually build TAI (or be one of the leading builders, if there are multiple) then you have more influence from inside than from outside in academia?
Yes. But of course it’s hard to know if that’s the case. I also think TAI is a less important category for me than x-risk-inducing AI.
Makes sense. I think we don’t disagree dramatically then.
I also think TAI is a less important category for me than x-risk inducing AI.
Also makes sense—just checking, does x-risk-inducing AI roughly match the concept of “AI-induced potential point of no return,” or is it importantly different? It’s certainly less of a mouthful, so if it means roughly the same thing maybe I’ll switch terms. :)
Um, sort of, modulo a type error… risk is risk. It doesn’t mean the thing has actually happened (we need to start using some phrase like “x-event” or something for that, I think).
I’ve started using the phrase “existential catastrophe” in my thinking about this; “x-catastrophe” doesn’t really have much of a ring to it though, so maybe we need something else that abbreviates better?