Reminds me of a Yudkowsky quote:

> Science isn’t fair. That’s sorta the point. An aspiring rationalist in 2007 starts with a huge advantage over an aspiring rationalist in 1957. It’s how we know that progress has occurred.
>
> To me the thought of voluntarily embracing a system explicitly tied to the beliefs of one human being, who’s dead, falls somewhere between the silly and the suicidal.

So it’s not that Eliezer is a better philosopher. Kant might easily have been a better philosopher, though admittedly I haven’t read Kant. But I expect Eliezer to be more advanced, having started from a higher baseline.

(However, I do suspect that Eliezer, like most of us, isn’t skilled enough at the art he described: as far as I’ve seen, the chain of reasoning behind his expectation of ruinous AGI on a short timeline seems to me surprisingly incomplete and unconvincing. My P(near-term doom) is shifted upward as much by his reputation as by anything else, which is not how it should be. My high P(long-term doom), though, is more self-generated, and was recently shifted down by others.)