I would rather the QM sequence were shortened to the low-controversy subset Eliezer described in An Intuitive Explanation of Quantum Mechanics and checked for technical accuracy. The pure MWI advocacy part belongs in some appendix, and the outright nonsense like “a Bayesian can become as smart as Einstein” should be chucked and never mentioned again.
+1
Citation needed. I believe shminux made this up.
He’s not making it up out of whole cloth, though he’s being significantly uncharitable.
More precisely, I think this is a reference to Einstein’s Speed and related Sequence posts, where EY argues that Einstein’s unusual success at understanding physical law was significantly due to updating on all available Bayesian evidence rather than just the subset of such evidence that non-Bayesian scientists use.
That said, it’s of course a huge jump from “Einstein would not have been as successful had he not been a Bayesian” to “any Bayesian can be as successful as Einstein,” and I don’t recall EY (or anyone else) ever making the latter claim.
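(As a rough sketch of the distinction being drawn here, and my own gloss rather than anything stated in those posts: in the odds form of Bayes’ theorem, each piece of evidence contributes its own likelihood-ratio factor, so a reasoner who conditions only on the “scientifically respectable” subset of the evidence simply leaves the remaining factors out of the update.)

\[
\frac{P(H \mid E_1,\dots,E_n)}{P(\lnot H \mid E_1,\dots,E_n)}
\;=\;
\frac{P(H)}{P(\lnot H)} \prod_{i=1}^{n} \frac{P(E_i \mid H)}{P(E_i \mid \lnot H)}
\qquad \text{(assuming the } E_i \text{ are conditionally independent given } H \text{ and given } \lnot H\text{)}
\]

Restricting attention to a subset of the evidence just drops the corresponding factors, which is the sense in which a scientist who ignores “non-scientific” evidence is updating on less than everything available.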
It seems to me that accusing someone of saying “outright nonsense” like this, when they in fact said nothing of the kind and only something vaguely related, is an act I would like to see discouraged whenever it is detected. Straw men are not welcome on LessWrong!
More precisely, the author ‘made this up’ because he believes it is acceptable to distort reality to that degree when arguing in this environment. Reception of the claim at the time I replied to it indicated that this belief is correct. I would prefer it if this were not so.
I share your preference for discouraging straw men and uncharitable readings.
I haven’t found anything Eliezer’s written about Einstein that wasn’t useful. Could you explain why you don’t like it (and/or specify what it is you dislike), or link me to an explanation?
If we want to shorten the QM stuff and explain MWI without belaboring its truth, I don’t think it would be out of the question to commission a specialist like Amit Hagar or David Albert to write a short explanation of what the QM-interpretation fuss is all about, insert that right before the more important philosophy-of-science material on QM’s implications (Think like reality, When science can’t help, etc.), and then put Eliezer’s technical explanations in an appendix. That would do a lot to mitigate the criticism that the Sequences espouse nonstandard physics without credentials, and it would lose fewer readers whose math or physics backgrounds are weak.
What has been proven wrong is the idea that explicit Bayesian thinking gives a physicist a significant rather than a marginal advantage. I don’t know of any physicist who learned Bayesian thinking and suddenly became much more productive/successful/famous. You are likely to do better with it than without it, but you will never be as good as a noticeably smarter not-explicitly-Bayesian physicist, let alone Einstein.
Eliezer’s waxing poetic about Barbour, who is a fringe scientist with intriguing ideas but without many notable achievements, is high on pathos, but not very convincing.