In general, it might be best (though this is controversial) to think of surviving the advent of AGI as being like solving an engineering problem. You don't get a pass on solving an engineering problem just because the problem turned out to be weirder than you expected. It's still an engineering problem, and almost all sets of actions still don't lead to the outcomes you want.
https://intelligence.org/2018/10/03/rocket-alignment/