Eliezer: it sounds like one of the most critical parts of Friendliness is stopping the AI from having nightmares! Blocking a self-improving AI from most efficiently mapping anything with consciousness or qualia, ever, without it first knowing first-hand what they are? Checking it doesn't happen by accident in any process?
I'm glad it's you doing this. It seems many people are only really bothered by virtual unpleasantness if it's done to simulated people.