I guess I will break my recently self-imposed rule of not talking about this anymore.
I can certainly envision a future where multiple powerful AGIs fight against each other and are used as weapons; some might be rogue AGIs, and others might be in the service of human-controlled institutions (such as nation-states). To put it more clearly: I have trouble imagining a future where something along these lines does NOT end up happening.
But this is NOT what Eliezer is saying. Eliezer is saying:
The alignment problem has to be solved ON THE FIRST TRY, because once you create this AGI, we are dead in a matter of days (maybe weeks or months; it does not matter). If someone thinks Eliezer is saying something else, I think they are not listening properly. Eliezer may have many flaws, but lack of clarity is not one of them.
In general, I think this is a textbook example of the motte-and-bailey fallacy. The motte is: AGI can be dangerous, AGI will kill people, AGI will be very powerful. The bailey is: AGI creation means the imminent destruction of all human life, and therefore we need to stop all development now.
I was never arguing about the motte. I agree with it.
I would certainly appreciate knowing the reason for the downvotes.
FYI, I upvoted your most recent comment but downvoted your previous few in this thread. Your most recent comment did a good job of spelling out your position and gesturing at your crux. My guess is that other people were just tired of the discussion and were downvoting partly to make the whole thing go away.