This being said, I still think that Eliezer’s reply succeeds.
There are a number of different messages being conveyed here. I agree that it looks like a success for at least one of them, but I’m worried about others.
I see Eliezer’s response to Holden’s challenge—“why do AGI at all?”—as: “Because you need FAI-grade skills to know if you need to do AGI or not.”
I agree with you that that is Eliezer’s strongest point. I am worried that it takes five thousand words to get across: that reflects poorly on the reply’s clarity and concision. But Holden is the one to ask about what his central point was, so my worry shouldn’t be stronger than my model of Holden.
Though I don’t know whether “The world needs FAI-grade programmers, even if we just want to do Tool AI right now” carries through to “Invest in SIAI as a charity,” which is what Holden is ultimately interested in.
Agreed, and that matches Holden’s ultimate recommendation: “SI should probably be funded at some level, but its current level seems too high.”