Yes, I really do. I’m afraid I can’t talk about all of the reasons for this (I work at OpenAI), but mostly it should be figure-outable from publicly available information. My timelines were already fairly short (2029 median) when I joined OpenAI in early 2022, and things have gone mostly as I expected. I’ve learned a bunch of stuff since then, some of which updated me upwards and some of which updated me downwards.
As for the 15%–15% thing: I don’t feel confident that those are the right numbers; rather, those numbers express my current state of uncertainty. I could see the case for making the 2024 number higher than the 2025 one (exponential distribution vibes, ‘if it doesn’t work now, then that’s evidence it won’t work next year either’ vibes). I could also see the case for making the 2025 number higher (it seems like it’ll happen this year, but in general projects usually take twice as long as one expects due to the planning fallacy, therefore it’ll probably happen next year).