To avoid misunderstanding, the kinds of “Pivotal Acts” you are talking about involve using an AGI to seize absolute power on a global scale. The “smallest” pivotal act given by Yudkowsky would still be considered an Act of War by any country affected.
The above is obvious to anyone who reads this forum but it’s worth emphasizing the magnitude of what is being discussed.
I understand your argument to be as follows:
A. A small group of humans gaining absolute control over the lightcone is bad (but better than a lot of other options).
B. But because it’s better than the other options, there is a moral imperative to commit a “pivotal act” if given the opportunity.
C. It is morally correct that this group then give up their power and safely hand it back to humanity.
I have two strongly held objections:
While A and C are correct, B overlooks the political and moral landscape we live in.
This post itself will influence researchers who admire you in a way that is harmful.
This argument overlooks the political and moral landscape we live in.
The political landscape:
Most (all?) groups of humans that have seized power and stated paternalistic intentions toward the subjugated people have abused that power in horrific ways. Everyone claims they’re trying to help the world, and most humans genuinely believe it. We’re fantastic at spinning narratives in which we’re the good guys. Nobody will trust your claim that you will give up power, nor should they.
If any domestic agency were to understand your intentions and believe you realistically had the capacity to carry them out, you would (at best) be detained. Realistically, if a rogue group of researchers were close to completing an AGI with the goal of using it to take over the world, then nuclear weapons would be considered warranted.
There is also going to be a strong incentive to seize or steal the technology during development. The hardware required for the “good guys” to perform a pivotal act will be dual use. The “bad guys” can also use it to perform a pivotal act.
The ideal team of fantastic, highly moral scientists probably won’t be the ones who make the final decisions about what the future looks like.
The moral landscape:
In worlds where we have access to the global utility function, seizing power to improve the world makes objective sense. In the world we actually live in, if you find yourself wanting to seize power to improve the world, there’s a good chance you’re closer to a mad scientist (though a well-intentioned one).
I don’t know of any human (or group of humans) on the planet I would trust to actually give up absolute power.
This post itself will influence researchers who admire you in a way that is harmful.
This post serves to spread the idea that a unilateral pivotal act is inevitably the only realistic way to save humanity. But by writing this post, you’re driving us closer to that world by discouraging people from looking into alternatives.