TDT, FAI (esp. CEV), acausal trading, MWI: regardless of whether they are true or not, the level of criticism is lower than one would expect, either because of the halo effect or affective death spirals (ADS).
I see these things being discussed here from time to time. I don’t see any general booming of them, still less any increasing trend. Eliezer, of course, has boomed MWI quite strongly; but he is no longer here.
My impression is that inside LW they are usually assumed true, while outside LW they are usually assumed false or highly questionable. Again, I’m not saying that these theories are wrong, but the pattern looks suspicious; almost every one of LW’s non-mainstream beliefs can be traced back to Eliezer. What a coincidence. One possible explanation is the halo effect of the Sequences. Or these ideas are actually underrated outside LW. Or my impressions are distorted.
I’m going with distorted.
Take MWI for example; apparently a lot of people are under the impression that LWers must be ~100% MWI fanatics. But the annual surveys report that lukewarm endorsements of MWI as the least bad QM interpretation cover, what, <50% of respondents? And it’s not clear to me that LW is even different from mainstream physicists, since the occasional polls of them show MWI keeps becoming more popular. It seems like people overgeneralize from the generally respectful treatment of MWI as a valid alternative (as opposed to early criticism of it as nonsense or crackpot pseudoscience) and from MWI topics being a lot more fun to discuss than, say, Copenhagen.
Or, global pandemics are regularly rated in the survey as a very concerning x-risk up there with AI, but are discussed much less; possibly because the risk of pandemics seems well-appreciated by society at large and there’s little new to discuss.
Similarly for some of the other stereotypical beliefs; critics like Stross and XiXiDu have been campaigning to turn Roko’s basilisk into the defining shibboleth of LW, but do even <5% of LWers take it seriously, or as anything more than an obscure hypothetical in one superseded decision theory? (I don’t think so, though in this case I can’t prove it with survey data.)
And as for TDT and acausal trading, they’re technical and difficult enough, relying heavily on formal logic and decision theory, that it’s hard to make any comments on them at all, either pro or con. Personally, I don’t believe in acausal trading. But I also don’t ever come out and say so, because I don’t feel I understand it or UDT/TDT well, am not particularly interested in them, and have nothing new to contribute to conversations about them; so why would I write about them, and if I did, why would you or anyone want to read what I wrote?