Sorry, I’ll be more concrete; “there’s a serious risk” is really vague wording. What would surprise me greatly is if I heard that Eliezer assigned even a 5% probability to there being a realistic quick fix to Roko’s argument that makes it work on humans. I think a larger reason for the ban was just that Eliezer was angry with Roko for trying to spread what Roko thought was an information hazard, and angry people lash out (even when it doesn’t make a ton of strategic sense).
Probably not a quick fix, but I would definitely say Eliezer assigns a significant probability (say, 10%) to there being some viable version of the Basilisk, which is why he actively avoids thinking about it.
If Eliezer was just angry at Roko, he would have yelled at Roko or banned him; instead, he banned all discussion of the subject. That doesn’t even make sense as a “lashing out” reaction against Roko.
It sounds like you have a different model of Eliezer (and of how well-targeted ‘lashing out’ usually is) than I do. But, like I said to V_V above:
According to Eliezer, he had three separate reasons for the original ban: (1) he didn’t want any additional people (beyond the one Roko cited) to obsess over the idea and get nightmares; (2) he was worried there might be some variant on Roko’s argument that worked, and he wanted more formal assurances that this wasn’t the case; and (3) he was just outraged at Roko (including for doing something that Roko himself thought would put people at risk of torture).
The point I was making wasn’t that (2) had zero influence. It was that (2) probably had less influence than (3), and its influence was probably of the ‘small probability of large costs’ variety.
I don’t know enough about this to tell if (2) had more influence than (3) initially. I’m glad you agree that (2) had some influence, at least. That was the main part of my point.
How long did discussion of the Basilisk stay banned? Wasn’t it many years? How do you explain that, unless the influence of (2) was significant?
It seems we disagree on this factual issue. Eliezer does think there is a risk of acausal blackmail, or else he wouldn’t have banned discussion of it.
I believe he thinks that sufficiently clever idiots competing to shoot off their own feet will find some way to do so.
It seems unlikely that they would, if their gun is some philosophical decision theory stuff about blackmail from their future. I don’t expect that gun to ever fire, no matter how many times you pull the trigger.
That is not what I said, and I’m also guessing you did not have a grandfather who taught you gun safety.