by frequently praising his own expertise, or appealing to a fictional utopia of fictional geniuses who agree that you’re an idiot or wrong[1])
This part in particular is easily one of the most problematic things I see Yudkowsky do. A fictional world can be almost arbitrarily different from our world, so lessons from a fictional world often fail to generalize (and that's conditional on the fiction even being logically coherent). There is thus very little reason to do this unless you are very careful, and at that point, you could just focus on the lessons of our own world's history.
Even when I mention things like halting oracles, which are almost certainly not possible in the world we live in, I don't make the mistake of thinking that halting oracles can give us insights into our own world: our world and a world where the halting problem is computable are so different that most lessons are non-transferable. (I'm referring to a recent discussion on Discord here.)
There are good reasons why we should mostly not use fiction to inform real-world beliefs very much, courtesy of Eliezer Yudkowsky himself:
https://www.lesswrong.com/posts/rHBdcHGLJ7KvLJQPk/the-logical-fallacy-of-generalization-from-fictional