So if you had $10 trillion, what would you do with it?
My worries, in order of priority, would be:
1. Someone manipulating or forcing me into giving a substantial amount of money away. After all, my decision-making process is the weakest link here.
2. Existential risks.
I don’t know what I’d do about 1., and I won’t waste my time thinking about a proper course of action for such low-probability scenarios. For 2., I’d hire all AI researchers to date to work under Eliezer, and start seriously studying so I could evaluate for myself whether flipping the “on” switch would result in a friendly singularity.
That could be more damaging than the creation of NASA. You would just suck up all the programmers and leave none to do the more down-to-earth, real-life programming.