I suggest adding some more meta questions to the list.
What improvements can we make to the way we go about answering strategy questions? For example, should we differentiate between “strategic insights” (such as Carl Shulman’s insight that WBE-based Singletons may be feasible) and “keeping track of the big picture” (forming the overall strategy and updating it based on new insights and evidence), and aim to have people specialize in each, so that the people deciding strategy won’t be tempted to overweight their own insights? Another example: is there a better way to combine probability estimates from multiple people? (See the sketch after this list.)
How do people in other fields answer strategy questions? Is there such a thing as a science or art of strategy that we can copy from (and perhaps improve upon with ideas from x-rationality)?
Should the subject be called “AI safety strategies” or “Singularity strategies”? (I prefer the latter.)
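On the question of combining probability estimates, here is a minimal Python sketch just to make the question concrete: it contrasts two standard pooling rules, the linear opinion pool (averaging probabilities directly) and averaging in log-odds space. The numbers are made up, and neither rule is being proposed as the right answer.

```python
import math

def pool_linear(probs):
    # Linear opinion pool: the simple average of the probabilities.
    return sum(probs) / len(probs)

def pool_log_odds(probs):
    # Average in log-odds space, then map back to a probability.
    # Compared to the linear pool, this rule gives confident (extreme)
    # estimates more influence and estimates near 0.5 less.
    log_odds = [math.log(p / (1 - p)) for p in probs]
    mean = sum(log_odds) / len(log_odds)
    return 1 / (1 + math.exp(-mean))

estimates = [0.9, 0.6, 0.2]  # three people's hypothetical estimates
print(pool_linear(estimates))    # ≈ 0.5667
print(pool_log_odds(estimates))  # ≈ 0.60
```

The two rules already disagree on this toy input, which is part of why “how should we combine estimates?” is a live question rather than a solved one.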