It seems to me that it works more as a cop out when people accuse(d?) them of not publishing much: “We are doing research, we promise, but it is just too dangerous for the uninitiated, humanity is not ready and stuff.”
I don’t think I’ve ever seen them use it as an excuse like that. And I wouldn’t expect them to, since as far as I know(!) they focus mainly on pure FAI stuff like formal verification or whatever. And I can’t imagine that they would lie to claim that they have secrets that would help build an AGI but which they can’t release for humanity’s sake—it would make them sound silly, and possibly even make them a target (although I admit this is far-fetched).
As for basilisks, I abide by the principle that if a highly intelligent person or group of people with professional expertise in field X says “this thing related to X is dangerous to do”, I will at least try to suppress my “push the button” impulse, even if I’m pretty sure, by my own judgement, that they’re wrong.
I think the whole basilisk thing is not (entirely) because it is “dangerous” but because they don’t want to be accused of extorting more fragile or gullible visitors. Think about it this way—if EY went full L Ron Hubbard, wouldn’t the basilisk be one of his primary tools of cult control?
That said, I consider the whole thing pretty silly.
Well, but EY is still using arguments of the form “donate to us or the future of the Galactic Civilization is at risk”; I don’t think the Basilisk would make much difference. If anything, EY could just declare the Basilisk invalid. His behavior is not consistent with him believing the argument.
Well, still better than “donate to us or you’ll go to hell”.
How about “don’t donate to them or you’ll go to hell”? That’s what they fear. Think about it: who is more likely to exist, according to their beliefs, Satan or God? And would Satan have a problem with using such tactics in order to make people dismiss God?
ph’nglui mglw’nafh UFAI R’lyeh wgah’nagl fhtagn