And yet, it doesn’t seem all that rare for you to post something saying “I just discovered another idiotic bug in my mental functioning. So I bypassed the Gibson using my self-transcending transcendence-transmogrification method, and I’m better now.” To my cynical eye, there seems to be some tension here.
As a programmer, I will charitably note that it’s not uncommon for a more serious bug to mask other more subtle ones; fixing the big one is still good, even if the program may look just as badly broken afterwards. Judging from his blog, he’s doing well enough for himself, and if he was in a pretty bad state to begin with, his claims may be justified. There’s a difference between “I fixed the emotional hang-up that was making this chore hard to do” and “I’ve fixed a crippling, self-reinforcing terror of failure that kept me from doing anything with my life”.
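To make the bug-masking point concrete, here is a minimal sketch (hypothetical code, invented purely for illustration) where a crash-level bug hides a subtler logic error until the crash is fixed:

```python
def average_scores(lines):
    total = 0
    count = 0
    for line in lines:
        total += int(line)    # bug 1: raises ValueError on any blank line
        count += 1
    return total / count + 1  # bug 2: stray off-by-one, invisible while
                              # bug 1 is still crashing every run

# With a blank line in the input, every run dies at bug 1. Fix it, and
# the output is still wrong thanks to bug 2, so the program “looks just
# as badly broken”, even though real progress was made.
print(average_scores(["90", "85", "80"]))  # prints 86.0; correct answer is 85.0
```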
That said, there is a lack of solid evidence, and the grandiosity of the claims suggests brilliant insight or crackpottery in some mixture—but then, the same could be said of Eliezer, and he’s clearly won many people over with his ideas.
“As a programmer, I will charitably note that it’s not uncommon for a more serious bug to mask other more subtle ones; fixing the big one is still good, even if the program may look just as badly broken afterwards.”
And one of the unfortunate things about the human architecture is that the more global a belief/process is, the more invisible it is… which is rather the opposite of what happens in normal computer programming. That makes high-level errors much harder to spot than low-level ones.
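A hedged illustration of that asymmetry (a made-up example, not drawn from any real codebase): a low-level mistake fails loudly at a specific line, while a wrong global assumption never raises an error anywhere.

```python
def to_celsius(f):
    # Low-level errors here fail loudly: misspell a name inside this
    # function and Python halts with a NameError at the exact line.
    return (f - 32) * 5 / 9

# High-level error: the program as a whole “believes” every feed is
# already in Celsius, so to_celsius is never called on this one, which
# actually reports Fahrenheit. Nothing raises, no single line is wrong;
# the mistaken belief lives everywhere and nowhere.
readings = [68.0, 72.5, 59.0]            # actually Fahrenheit
average = sum(readings) / len(readings)  # runs fine, silently wrong
print(f"average temperature: {average:.1f} C")  # mislabeled output
```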
For the first year or so, I spent way too much time dealing with “hang-ups making this chore hard to do”, not realizing that the more important hang-ups are about why you think you need to do those chores in the first place. So it has been taking a while to climb the abstraction tree.
For another thing, certain processes are difficult to spot because they’re cyclical over a longer time period. I recently realized that I was addicted to getting insight into problems, when it wasn’t really necessary to understand them in order to fix them, even at the relatively shallow level of understanding I usually worked with. In effect, insight was just a way of convincing myself to “lower the anti-persuasion shields”.
The really crazy/annoying thing is I keep finding evidence that other people have figured ALL of this stuff out before, but either couldn’t explain it or couldn’t convince anybody else to take it seriously. (That doesn’t make me question the validity of what I’ve found, but it does make me question whether I’ll be able to explain/convince any more successfully than the rest did.)
“That said, there is a lack of solid evidence, and the grandiosity of the claims suggests brilliant insight or crackpottery in some mixture”
Heh, you think mine are grandiose, you should hear the claims that other people make for what are basically the same techniques! I’m actually quite modest. ;-)
“That said, there is a lack of solid evidence, and the grandiosity of the claims suggests brilliant insight or crackpottery in some mixture—but then, the same could be said of Eliezer, and he’s clearly won many people over with his ideas.”
Precisely the point. We’re not interested in how to attract people to doctrines (or at least I’m not), but in determining what is true and finding ever-better ways to determine what is true.
The popularity of some idea is absolutely irrelevant in itself. We need evidence of coherence and accuracy, not prestige, in order to reach intelligent conclusions.
“The popularity of some idea is absolutely irrelevant in itself.”
Compelling, but false. Ideas’ popularity not only contributes network effects to their usefulness (which might be irrelevant by your criteria), but it also provides evidence that they’re worth considering.
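For what it’s worth, here is a toy Bayesian sketch of that last claim (every number invented for illustration): if worthwhile ideas become popular even somewhat more often than worthless ones, then observing popularity raises the posterior, i.e. it is evidence, just usually weak evidence.

```python
prior = 0.05             # assumed base rate of worthwhile ideas
p_pop_given_good = 0.30  # invented: worthwhile ideas spread more often
p_pop_given_bad = 0.10   # invented: worthless ideas spread less often

# Bayes' rule: P(good | popular) = P(popular | good) * P(good) / P(popular)
p_pop = prior * p_pop_given_good + (1 - prior) * p_pop_given_bad
posterior = prior * p_pop_given_good / p_pop
print(round(posterior, 3))  # 0.136: nearly triple the 0.05 prior
```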