Huh, that’s surprising to me, that’s not how I interpret Drexler. Isn’t an agent that wants to maximize paperclips in our world only… still going to take over and transform the earth into paperclips if it can?
Yes, that’s my example of a dangerous agent with broadly defined goals. The goals cover an unbounded amount of our world.
Ah, I see—you think an agent that cares only about a distant part of Tegmark IV wouldn’t take over even if it could. I disagree, or at least think it isn’t obvious and more details need to be specified.