I don’t want to get super hung up on this because it’s not about anything Yudkowsky has said but:
Consider the whole transformed line of reasoning:
avian flight comes from a lot of factors; you can’t just ape one of the factors and expect the rest to follow; to get an entity which flies, that entity must be as close to a bird as birds are to each other.
IMO this is not a faithful transformation of the line of reasoning you attribute to Yudkowsky, which was:
human intelligence/alignment comes from a lot of factors; you can’t just ape one of the factors and expect the rest to follow; to get a mind which wants as humans do, that mind must be as close to a human as humans are to each other.
Specifically, where you wrote “an entity which flies”, you were transforming “a mind which wants as humans do”, which I think should instead be transformed to “an entity which flies as birds do”. And indeed planes don’t fly like birds do. [EDIT: two minutes or so after pressing enter on this comment, I now see how you could read it your way]
I guess if I had to make an analogy, I would say that you have to be pretty similar to a human to think the way we do, but probably not to pursue the same ends, which is probably the point you cared about establishing.