I was referencing how it is difficult to effectively lead an organization that is so focused on the distant future and which must make so many difficult decisions.
Oh! Well I feel stupid indeed. I thought that all the text after the sidenote was a quotation from Luke (which I would find at the link in said sidenote), rather than a continuation of Mike Darwin’s statement. I don’t know why I didn’t even consider the latter.
Additionally, the link in the OP is wrong. I followed it in hopes that Luke would provide a citation for these estimates.
This was the quote I was referring to:

"So, I often have a nagging worry that what I'm working on only seems like it's reducing existential risk after the best analysis I can do right now, but actually it's increasing existential risk. That's not a pleasant feeling, but it's the kind of uncertainty you have to live with when working on these kinds of problems. All you can do is try really hard, and then try harder."
I should have been clearer.