Yes. Intuition is valuable when used in concert with rational analysis, and perhaps that’s under-appreciated particularly in x-risk.
And: we should all be doing this for x-risk, in all directions, right up to the point where it becomes a waste of time. If you’ve got nagging doubts about whether you’re spending your time properly, you should investigate them with your rational mind.
I work on x-risk full time. I have my model of what the risks are, and where the effort should go. My model has lots of uncertainty. I have spent serious time understanding Bayesian reasoning and statistics, other forms of reasoning, logic, science, and even philosophy in case that helps (it does).
And I’d still guess that I’m not just wrong, but importantly wrong.
The situation surrounding x-risk and particularly AGI is highly complex. It’s so complex that I don’t think anyone with a merely human mind can claim to have finished their work on figuring out where their effort should go (and therefore where their emotional motivation should lie).
So what I think we can say is that strategic thinking is important (that is, figuring out what’s more likely to happen in the future, and therefore where your efforts should go in the present). This is an expansion of the point made here: I think you should do as the post suggests, and take seriously your intuitions surrounding x-risk.
Where I guess I disagree is with the implication that some rational analysis can settle those nagging doubts. It can’t. The situation is too complex, and we’re all trying to think with monkey brains (with their nontrivial biases and severely limited resources).
But this type of thinking can still help quiet your doubts. Combining intuition with analysis gets us closer to the truth, even if it never settles the question completely.
But there’s clearly such a thing as paying too much attention to nagging doubts. Weighing strategic analysis against getting object-level work done is a judgment call. We mustn’t work on strategy to the exclusion of doing actual work (much less simply fret or worry, that is, pay attention to nagging doubts without making real progress in resolving them).
I’d guess that even most people working on alignment and x-risk are spending too little time on strategic analysis. But those people are self-selected for getting things done. There are others who are excessively paralyzed by strategic analysis, who are paying too much attention to their nagging doubts (which probably pull in multiple directions).
One way our situation is unique is that in most professions, most of the useful strategic analysis has already been done. Sometimes it was done by your predecessors: if you’re a plumber or electrician, people have already tried a bunch of business models and techniques. In faster-developing fields, a hierarchical structure often designates who’s meant to do strategic analysis (management) and who’s supposed to sit down and get shit done.
Alignment and other x-risk fields have neither of those sources of strategic analysis. We’re all mostly doing our own, based on others’ public writings. This all takes time.
It’s time well spent, until it isn’t. Intuition and analysis are both needed, and when enough is enough is always a judgment call that itself requires intuition and analysis, once again employed in artful combination.