What Eliezer said. I was arguing from the assumption that he is wrong about FAI and stuff. If he’s right about the object level, then he’s not deluded in considering himself important.
But if he is wrong about FAI and stuff, then he is still deluded: not specifically about considering himself important (that implication is correct), but about FAI and stuff.
Agreed.
Which, of course, would still leave the second two dot points as answers to your question.
How so? Eliezer’s thesis is “AGI is dangerous and FAI is possible”. If he’s wrong—if AGI poses no danger or FAI is impossible—then what do you need a Frodo for?
Edited the grandparent to disambiguate the context.
(I haven’t discussed that particular thesis of Eliezer’s, nor does doubting that particular belief seem to be a take-home message from multi’s post. The great-grandparent is just a straightforward answer to the paragraph it quotes.)