Rationality and common sense might be bad for your chances of achieving something great, because you need to irrationally believe that it’s possible at all.
That is true.
If you want to achieve something great, don’t be a skeptic about it. Be utterly idealistic.
Well, umm… there is the slight issue of cost. If you deliberately choose a high-risk strategy to give yourself a chance at a huge payoff, you need to realize that the most likely outcome is that you fail. Convincing yourself that you are destined to become a famous actress does improve your chances of getting into the movies, but most people who believe this will end up as waitresses in LA.
It’s like “If you want to become a millionaire, you need to buy lottery tickets” :-/
Yeah. I actually wrote a post about that :-)
Agreed on all points. It could still make sense to adopt high-risk beliefs if you’ve already decided that you want to work on something, and the expected payoff outweighs the cost. Friendly AI development might be one such area.