Kudos, Luke! It’s nice to see some serious applied rationality at the executive levels of SI.
One thing I’d love to see is some more risk analysis, which seems natural, given that SI is all about x-risk. What can go wrong in the next year? What are the odds? What would be the consequences if a certain risk is left unmitigated? What would such mitigation look like, how much would it cost and what would be the odds of its success? What potential risks are existential risks for SI as an organization (and can you list them without flinching away)?
In general, next year is too short a horizon: next year's risks are determined mostly by what you did five or more years ago.