Outside view indicates that if you stare at the basilisk, you will most likely either a) think it was a terrible idea, wish you hadn’t, and maybe have nightmares, or b) wonder what the heck all the fuss is about and consider it a waste of time, except insofar as you might consider censorship wrong in itself and might thereby be tempted to share the basilisk with others, each of whom has an independent risk of suffering reaction (a).
Do you want to want to stare at the basilisk?
I’ll add my opinion to the list:
I’m not an a, or a b.
Turns out, the basilisk was very close to one of a list of things I’d thought up, based on the nature of this community’s elders, and dismissed with “no, no, they wouldn’t buy into that idea, would they? No-one here would fall for that...”.
Reading about it, combined with the knowledge that EY banned it, gives me an insight into EY’s thought patterns that significantly decreases my respect for him. I think that insight was worth the effort involved in reading it.
Honestly, I was surprised at EY’s reaction. I thought he had figured out problems like that and would tear it to pieces rather than react the way he did. Possibly I’m not as smart as he is, but even presuming Roko’s right, you would think Rationalists Should Win. Plus, I think Eliezer has publicly published something similar to the Basilisk, albeit much weaker and without being explicitly basilisk-like, so I’d have thought he would have worked out a solution. (EDIT: No, turns out it was someone else who came up with it. It wasn’t really fleshed out, so Eliezer may not have thought much of it or never noticed it in the first place.)
The fact that people are upset by it could be a reason to hide it away, though, to protect the sensitive. Plus, having seen Dogma, I get that the post could be an existential risk...
I don’t think hiding it will prevent people from getting upset. In fact, hiding it may make people more likely to believe it, and thus get scared. If someone respects EY and EY says “this thing you’ve seen is a basilisk”, then they’re more likely to be scared than if EY says “this thing you’ve seen is nonsense”.
My understanding is that the post isn’t the x-risk: a UFAI could think this up itself. The reaction to the post is supposedly an x-risk: if we let on we can be manipulated that way, then a UFAI can do extra harm.
But if you want to show that you won’t be manipulated a certain way, it seems that the right way to do that is to tear that approach apart and demonstrate its silliness, not seek to erase it from the internet. I can’t come up with a metric by which EY’s approach is reasonable.
(Concerns not necessarily limited to either existential or UFAI, but we cannot discuss that here.)
Agree. :)
Yes, but not in the way you seem to be saying. I was semi-joking here, in that the post could spook people enough to increase x-risks (which wfg seems to be trying to do, albeit as blackmail rather than for its own sake). I was referring to how in the film Dogma gjb snyyra natryf, gb nibvq uryy, nggrzcg gb qrfgebl nyy ernyvgl. (rot13’d for spoilers, and in case it’s too suggestive of the Basilisk)
It can? I suppose I just don’t get decision theory. The non-basilisk part of that post left me pretty much baffled.
Brennan is a fucking retard. No, you don’t want to know. You want to signal affiliation with desirable groups, to send hard-to-fake signals of desirable personality traits such as loyalty, intelligence, power, and the presence of informed allies. You want to say everything bad you possibly can about the outgroup and everything good about the ingroup. You want to preach altruism and then make a plausible but unlikely reasoning error which conveniently stops you from having to give away anything costly.
All the other humans do all of these things. This is the true way of our kind. You will be punished if you deviate from the way, or even if you try to overtly mention that this is the way.
This may be the way now, but it doesn’t have to be the way always.
You seem to be talking mainly in part (a) about the pseudo-basilisk rather than the basilisk itself. I suspect that most people who are vulnerable to the pseudo-basilisk are either mentally ill or vulnerable to having similar issues simply when thinking about the long-term implications of the second law of thermodynamics or the like. If one is strongly vulnerable to that sort of disturbing idea, then between the known laws of physics and the nasty low-probability claims made by some major religions, most basilisking of this sort is already well-covered.
Minus the censorship part, that’s not worse than watching Saw.
One can receive partial-impact synopses of Saw without risking the full effect, and gauge one’s susceptibility with more information on hand.
There’s a reason I’ve refrained from seeking out 2 Girls 1 Cup.
(I should stop bringing it into my mind, really.)
True. I think that after reading the debate(s) about the censored post one should have a pretty good idea of what it is, though.
My own reaction was
c) More.
Yes, I know. I’m hopelessly stupid.