I saw the original idea and the discussion around it, but I was (fortunately) under stress at the time and initially dismissed it as so implausible as to be unworthy of serious consideration. Given the reactions to it by Eliezer, Alicorn, and Roko, who seem very intelligent and know more about this topic than I do, I’m not so sure. I do know enough to say that, if the idea is something that should be taken seriously, it’s really serious. I can tell you that I am quite happy that the original posts are no longer present, because if they were I am moderately confident that I would want to go back and see if I could make more sense out of the matter, and if Eliezer, Alicorn, and Roko are right about this, making sense out of the matter would be seriously detrimental to my health.
Thankfully, either it’s a threat that I don’t fully understand, in which case I’m safe, or it’s not a threat, in which case I’m also safe. But I’m concerned enough by the possibility that it’s a threat I don’t fully understand, yet might work out independently given enough thought, that I’m consciously avoiding extended thought about the matter. I will respond to posts that directly relate to this one but am otherwise done with this topic; rest assured that, if you missed this one, you’re really quite all right for it!
Given the reactions to it by Eliezer, Alicorn, and Roko, who seem very intelligent and know more about this topic than I do, I’m not so sure.
This line of argument really bothers me. What does it mean for E, A, and R to seem very intelligent? As far as I can tell, the necessary conclusion is “I will believe a controversial statement of theirs without considering it.” When you word it like that, the standards are a lot higher than “seem very intelligent”, or at least narrower: you need to know their track record on decisions like this.
(The controversial statement is “you don’t want to know about X,” not X itself, by the way.)
I am willing to accept the idea that (intelligent) specialists in a field may know more about their field than nonspecialists and are therefore more qualified to evaluate matters related to their field than I.
Good point, though I would point out that you need E, A, and R to be specialists when it comes to how people react to X, not just X, and I would say there’s evidence that’s not true.
I agree, but I know what conclusion I would draw from the belief in question if I actually believed it, so the issue of their knowledge of how people react is largely immaterial to me in particular. I was mostly posting to provide a data point in favor of keeping the material off LW, not to attempt to dissolve the issue completely or anything.
When you word it like that, the standards are a lot higher than “seem very intelligent”, or at least narrower: you need to know their track record on decisions like this.
You don’t need any specific kind of proof; you already have some state of knowledge about the correctness of such statements. There is no “standard of evidence” for forming a state of knowledge; it may just be that, without evidence meeting that “standard”, you don’t expect to reach some level of certainty, or some level of stability in your state of knowledge (i.e., a low expectation of changing your mind).