Drexler’s vision of comprehensive AI services is a beautiful fantasy, IMO. Agents are too powerful: there will be plenty of AI services, yes, but there will also be AI agents, and the agents are what we are worried about.
Yet your definition of “agent” here is sufficiently different from what Drexler is trying to avoid that it isn’t clear to me whether this post actually disagrees with Drexler.
Drexler is focused on avoiding agents whose goals are defined over large parts of our spacetime. An agent that cares only about a distant part of Tegmark IV is different from an agent that cares how many paperclips this world has. And there are important cases in between these possibilities that are harder to evaluate.
Can anyone suggest a better label for what Drexler is trying to avoid?
Huh, that’s surprising to me; that’s not how I interpret Drexler. Isn’t an agent that wants to maximize paperclips only in our world… still going to take over and transform the earth into paperclips if it can?
Yes, that’s my example of a dangerous agent with broadly defined goals. Its goals cover an unbounded portion of our world.
Ah, I see: you think an agent that cares only about a distant part of Tegmark IV wouldn’t take over even if it could. I disagree, or at least I think it isn’t obvious; more details would need to be specified.