I’m very new to Less Wrong in general, and to Eliezer’s writing in particular, so I have a newbie question.
Specifically, I’m puzzled by these two passages of Eliezer’s:

…any more than you’ve ever argued that “we have to take AGI risk seriously even if there’s only a tiny chance of it” or similar crazy things that other people hallucinate you arguing.

…just like how people who helpfully try to defend MIRI by saying “Well, but even if there’s a tiny chance...” are not thereby making their epistemic sins into mine.
I’ve read AGI Ruin: A List of Lethalities, and I legitimately have no idea what is wrong with “we have to take AGI risk seriously even if there’s only a tiny chance of it”. What is wrong with it? If anything, this seems like something I would say if I had to explain the gist of AGI Ruin: A List of Lethalities to someone else as briefly as possible, in just a few words.
The fact that I have absolutely no clue what is wrong with it probably means that I’m still very far from understanding anything about AGI and Eliezer’s position.
List of Lethalities isn’t telling you “There’s a small chance of this.” It’s saying, “This will kill us. We’re all walking dead. I’m sorry.”
Ok, thank you for the clarification!