So, let me give a hypothetical scenario; tell me how such a group would help in that scenario.
We’re on a planet controlled by mind-reading SpaceNazis. You suddenly realize through scientific research that all purple-eyed people (3 percent of the general population) are actually of SpaceJewish heritage. SpaceNazis kill people of SpaceJewish heritage. You are also purple-eyed, by the way.
This is effectively a basilisk in the sense of “harmful knowledge”. You’re shortly due for your standard monthly mind-reading by the regime. What could your contact group do to help? Brainwash you (and themselves) into forgetting it?
Assuming that the mind readers punish people who don’t turn in SpaceJews, trying to tell others about the problem is either suicide or mini-basilisk inducing, depending on that person’s disposition towards me. On the other hand, if I were to ask help from a group of other purple-eyed SpaceJews who had already determined the same secret and were in no greater danger for that request, we would at least be slightly more likely to come up with a solution better than “commit suicide to protect other purple-eyed SpaceJews.”
As such, the purpose of a closed group would be to negotiate the basilisk(s) without creating additional risk, because anyone informed of said basilisk(s) would either
A) Already be distracted by the same basilisk, creating zero net loss, or
B) Be distracted by an old basilisk, the idea being that their previous distraction by, and non-investment in, the ideas that set up the new basilisk will make them less likely to get caught in the same loop and more likely to come up with a creative solution. As David_Gerard put it:
“From (anecdotal-level) observation of examples, the famous LW basilisk is something that you need a string of things going wrong to be upset by: you need to believe certain Sequence memes, you need to believe they imply particular things in a particular way, you need to be smart enough to understand for yourself how they imply what they do, and you need to be obsessive in just the wrong way.”
Suppose basilisk A carries consequence Z and is known to John. David, however, does not care either way about consequence Z, possibly because he already knows about basilisk B and is more concerned about its consequence X; John is in the mirror position, caring about Z but not about X. Since each is already distracted by a basilisk either way, they could trade basilisks, each hoping that the other might come up with a resolution, without worrying about spreading either basilisk to somebody who would actually suffer a decreased quality of life from it.