You don’t give me a very good reason to think basilisks are anything more than specific instances of OCD, depressive, or other mental-illness spirals. If you think we should form a mental illness support group for LessWrongers, I wholeheartedly support that, but let’s leave out basilisks until they actually come up.
From (anecdotal-level) observation of examples, the famous LW basilisk is something that you need a string of things going wrong to be upset by: you need to believe certain Sequence memes, you need to believe they string together to imply particular things in a particular way, you need to be smart enough to understand for yourself how they imply what they do, and you need to be obsessive in just the wrong way.
The question then is what to do about it. Freedom of inquiry is absolutely necessary for science as done by mere humans to actually work, but this is not happening, for various reasons that seemed like good ideas at the time.
refs: a call to decompartmentalise, as compartmentalisation is in fact an epistemic sin; the dangers of doing so.
Wouldn’t a mental illness group targeted at LessWrongers be about basilisk-like problems, since basilisks are mentioned more prominently here?
If there’s no one else who has basilisks to share, then no action ought to be taken regarding basilisks, yes. However, if other people do have basilisks, then something ought to be done, preferably in a closed environment. The thing is, right now we have no sure way of knowing if people have basilisks, because no one in their right mind would actually tell others the details without prompting.
In any case, if somebody else has a basilisk, they might come up and comment with “Yeah, let’s do it”. If nobody else has basilisks, then that won’t happen, and only then will the group be proven unnecessary. I’d rather wait and see whether that happens or not.
The base rate for any mental illness is hugely higher than the base rate for people who get basilisked, even within LessWrong. I think a group talking about rational, empirical, and practical ways to deal with having, or knowing people who have, various mental variances would be pretty cool. A basilisk group, on the other hand, would get few to no people talking in it, or would get swiftly banhammered.
I realize it’s currently 10 months later, but as someone of indeterminate diagnosis who is very definitely not neurotypical, I would find this very, very useful. I was extremely disappointed to find that the LessWrong wiki page for “corrupted hardware” just lists a few references to Sequence posts that invoke evolutionary psychology (an often-mildly-dubious science in the first place) rather than an abundance of references to specific hardware corruptions and how to deal with them.
I recommend you start a thread in discussion or post in the open thread about this question! People generally like to post to new “advice repository” style threads, and people like talking about themselves, and this can be both.
Actually, that’s a pretty good idea. I think I’ll do that.
So, let me give a hypothetical scenario; tell me how such a group would hypothetically help in that scenario.
We’re on a planet controlled by mind-reading SpaceNazis. Through scientific research, you suddenly realize that all purple-eyed people (3 percent of the general population) are actually of SpaceJewish heritage. SpaceNazis kill people of SpaceJewish heritage. You are also purple-eyed, by the way.
This is effectively a basilisk in the sense of “harmful knowledge”. You’re shortly up for your standard monthly mindreading by the regime. What could your contact group do to help? Brainwash you (and themselves) into forgetting it?
Assuming that the mindreaders punish people who don’t turn in SpaceJews, trying to tell others about the problem is either suicidal or mini-basilisk-inducing, depending on that person’s disposition towards me. On the other hand, if I were to ask for help from a group of other purple-eyed SpaceJews who had already arrived at the same secret and were in no greater danger for that request, we would at least be slightly more likely to come up with a solution better than “commit suicide to protect the other purple-eyed SpaceJews.”
As such, the purpose of a closed group would be more to negotiate the basilisk(s) in a way that doesn’t create additional risk, because anyone informed of said basilisk(s) would either:

A) already be distracted by the same basilisk, creating zero net loss, or

B) be distracted by an old basilisk, the idea being that their previous distraction by, and non-investment in, the ideas that set up the new basilisk will make them less likely to get caught in the same loop and more likely to come up with a creative solution. As David_Gerard said:

“From (anecdotal-level) observation of examples, the famous LW basilisk is something that you need a string of things going wrong to be upset by: you need to believe certain Sequence memes, you need to believe they imply particular things in a particular way, you need to be smart enough to understand for yourself how they imply what they do, and you need to be obsessive in just the wrong way.”
Suppose basilisk A has consequence Z and is known by John. David, however, does not care either way about consequence Z, possibly because he already knows about basilisk B and is more concerned about consequence X, and John is in the analogous position regarding which basilisk matters more. Since both are already being distracted by a basilisk either way, they could trade basilisks, each hoping that the other might come up with a resolution, without worrying about spreading it to somebody who would actually suffer a decreased quality of life for it.
He’s not trying to give that reason for very good reasons...
edit: just realized it might be misinterpreted as me taking it seriously.
Look. I’m #10 of all time on the TopCoder Marathon Match impressive debuts page. That was 4.5 years ago, I was still a newbie, and it was the first programming contest of any kind I had ever done. Their Elo-like ranking system, which penalizes losing to newbies, combined with a contest where you could just as well test everything offline, prompted many high-ranked contestants not to submit solutions, impairing my score bump. Trust me, I can understand the math.
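For readers unfamiliar with why losing to newcomers is so costly under Elo-style ratings, here is a minimal illustrative sketch of a generic Elo update. This is an assumption-laden toy, not TopCoder’s actual Marathon Match rating formula (which is more elaborate); the function names, K-factor, and ratings below are made up for illustration.

```python
# Generic Elo-style update, for illustration only; TopCoder's actual
# Marathon Match rating algorithm is different and more involved.

def expected_score(rating_a: float, rating_b: float) -> float:
    """Probability that player A beats player B under the Elo model."""
    return 1.0 / (1.0 + 10 ** ((rating_b - rating_a) / 400.0))

def elo_update(rating_a: float, rating_b: float, a_won: bool, k: float = 32.0) -> float:
    """Player A's new rating after one head-to-head result against B."""
    actual = 1.0 if a_won else 0.0
    return rating_a + k * (actual - expected_score(rating_a, rating_b))

# A high-rated contestant gains almost nothing by beating a newbie but loses
# a lot by losing to one, which is the asymmetry alluded to above.
veteran, newbie = 2200.0, 1200.0
print(round(elo_update(veteran, newbie, a_won=True), 2))   # ~2200.10 (tiny gain)
print(round(elo_update(veteran, newbie, a_won=False), 2))  # ~2168.10 (large loss)
```

Under an update like this, a high-rated contestant who can test everything offline has little to gain and much to lose by submitting, which is the incentive described above.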
I’m not concerned with it actually working, and neither should you be. I’m rather bored of this topic, and this [whatever it is] deletes counterarguments to it, which is really weird. But if you actually have some anguish (and the tales of OCD suffering are not some sort of urban legend), you can mail me and I’ll talk you out of it, or try to.