The normal control problem assumes that no specific agency in the programs (especially not super-intelligent agency)
There seems to be a verb missing in that sentence... did you mean "...assumes that there is no specific agency in the programs..."?
(Nitpicks aside, I think this is the right approach: build on current safety and control knowledge, rather than assume that all future AIs will follow some very specific decision theory.)
Thanks. Edited.