The world might be in a state where attempts to do good instead bring about some failure, while doing the opposite is prevented by society (the rise of AI, and the blame or credit the rationality movement takes for it, perhaps?).
What if, on some numerical scale, the world offered you the option "with 50% probability, double your goodness score; otherwise, lose almost everything"? Maximizing expected value on this gamble is very dangerous...
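The danger can be made concrete with a quick simulation. Per round the expected multiplier is above 1 (0.5·2 + 0.5·ε > 1 for small ε), so a naive EV maximizer takes the bet every time, yet the expected log-growth is sharply negative, so the typical outcome is near-total ruin. A minimal Python sketch, where the "keep 1% on a loss" fraction and the round count are illustrative assumptions, not anything specified above:

```python
import random
import statistics

def play(rounds, start=1.0, keep=0.01, rng=None):
    """Take the gamble every round: 50% double, 50% keep only `keep`."""
    rng = rng or random.Random()
    wealth = start
    for _ in range(rounds):
        wealth = wealth * 2 if rng.random() < 0.5 else wealth * keep
    return wealth

rng = random.Random(0)
finals = [play(50, rng=rng) for _ in range(10_000)]

# The mean is propped up by a handful of lucky streaks,
# while the median (the typical trajectory) collapses toward zero.
mean = statistics.mean(finals)
median = statistics.median(finals)
print(f"mean={mean:.3g}  median={median:.3g}")
```

The same point in Kelly-criterion terms: the expected log-multiplier per round is 0.5·ln 2 + 0.5·ln 0.01 ≈ −1.96 < 0, so with probability 1 a repeated EV maximizer goes broke, even though the arithmetic EV grows every round.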
Never say ‘nothing’ :-)