I think Chris Langan and the CTMU are very interesting, and I think there is an interesting and important challenge for LW readers to figure out how (and whether) to learn from Chris. Here are some things I think are true about Chris (and about me) and relevant to this challenge. (I do not feel ready to talk about the object-level CTMU here; I am mostly just talking about Chris Langan.)
Chris has a legitimate claim of being approximately the smartest man alive according to IQ tests.
Chris wrote papers/books that make up a bunch of words that are defined circularly, and are difficult to follow. It is easy to mistake him for a complete crackpot.
Chris claims to have proven the existence of God.
Chris has been something-sort-of-like-canceled for a long time. (In the way that seems predictable when “World’s Smartest Man Proves Existence of God.”)
Chris has some followers that I think don’t really understand him. (In the way that seems predictable when “World’s Smartest Man Proves Existence of God.”)
Chris acts socially in a very nonstandard way that seems like a natural consequence of having much higher IQ than anyone else he has ever met. In particular, I think this manifests in part as an extreme lack of humility.
Chris is actually very pleasant to talk to if (like me) it does not bother you that he acts like he is much smarter than you.
I personally think the proof of the existence of God is kind of boring. It reads to me as kind of like “I am going to define God to be everything. Notice how this meets a bunch of the criteria people normally attribute to God. In the CTMU, the universe is mind-like. Notice how this meets a bunch more criteria people normally attribute to God.”
While the proof of the existence of God feels kind of mundane to me, Chris is the kind of person who chooses to interpret it as a proof of the existence of God. Further, he also has other more concrete supernatural-like and conspiracy-theory-like beliefs that I expect most people here would want to bet against.
I find the CTMU in general interesting (but I do not claim to understand it).
I have noticed many thoughts that come naturally to me that do not seem to come naturally to other people (e.g. about time or identity), where it appears to me that Chris Langan just gets it (as in he is independently generating it all).
For years, I have partially depended on a proxy when judging other people (e.g. for recommending funding) that is something like “Do I, Scott, like where my own thoughts go in contact with the other person?” Chris Langan is approximately at the top according to this proxy.
I believe I and others here probably have a lot to learn from Chris, and arguments of the form “Chris confidently believes false thing X,” are not really a crux for me about this.
IQ is not the real think-oomph (and I think Chris agrees), but Chris is very smart, and one should be wary of clever arguers, especially when trying to learn from someone with much higher IQ than you.
I feel like I am spending (a small amount of) social credit in this comment, in that when I imagine a typical LWer thinking “oh, Scott semi-endorses Chris, maybe I should look into Chris,” I imagine the most likely result is that they will conclude that Chris is a crackpot, and that Scott’s semi-endorsements should be trusted less.
In particular, I think this manifests in part as an extreme lack of humility.
I just want to note that, based on my personal interactions with Chris, I experience Chris’s “extreme lack of humility” similarly to how I experience Eliezer’s “extreme lack of humility”:
in both cases, I think they have plausibly calibrated beliefs about having identified certain philosophical questions that are of crucial importance to the future of humanity, that most of the world is not taking seriously,[1] leading them to feel a particular flavor of frustration that people often interpret as an extreme lack of humility
in both cases, they are in some senses incredibly humble in their pursuit of truth, doing their utmost to be extremely honest with themselves about where they’re confused
It feels worth noting that Chris Langan wrote about Newcomb’s paradox in 1989, and that his resolution involves thinking in terms of being in a simulation, similarly to what Andrew Critch has written about.
I agree with this.
Thanks, I was looking for that link to his resolution of Newcomb’s paradox.
Too funny! “You are “possessed” by Newcomb’s Demon, and whatever self-interest remains to you will make you take the black box only. (Q.E.D.)”
Would you kindly explain this? Because you think some of his world-models independently throw out great predictions, even if other models of his are dead wrong?
More like illuminating ontologies than great predictions, but yeah.