They are not basilisks. Basilisks do real damage to you; they don’t merely make you nauseous. To get the general idea of the concept, see RationalWiki, which illustrates what apparently constituted a basilisk for certain psychologically vulnerable individuals. Of course, others would warn you that you shouldn’t find out what an alleged basilisk is, on the off chance that it actually is one!
Are there any useful links to stuff on this concept? I expect that being exposed to ideas that mess with your mind might be a good way to develop the ‘mental immune system’.
That particular event is a rather strange bit of early LW lore. It seems to have largely passed out of the site’s public consciousness now, but for a while it cast a long shadow: doing a site search on “forbidden topic” ought to give you an outline of the opinion surrounding it.
I’d advise against deliberately seeking out allegedly harmful knowledge in order to expose yourself to it. I’m aware of no particular evidence that the mind responds to memetic threats (as opposed to non-memetic stresses) by hardening itself against them, and with confirmation bias and group-identification behavior in mind, there are a number of reasons to suspect it doesn’t. I should probably temper that by admitting I haven’t always followed my own advice here, though.
On the other hand, I do think there’s room for a more general theory of harmful knowledge. While some of the groundwork has been laid, and we have a few ad-hoc guidelines in place, we don’t yet have a good consensus on epistemic safety, as the comments on the World of Warcraft thread (to say nothing of this one!) demonstrate. About as close as I’ve seen anyone get is Nick Bostrom’s 2009 paper on information hazards, but it limits itself to typology. Contributing to such a theory might be a valuable thing to pursue, if you’re determined to risk your sanity.
As far as I can tell, people’s vulnerability to memetic hazards that drive some people insane but not others should be very predictable. Granted, there are problems with retrospectively changing one’s outlook to defend against some of them, but it shouldn’t be too hard to test whether someone already has the appropriate cached-response defenses up without exposing them to the idea itself.
I don’t think I’d go as far as deliberately risking my sanity (such as it is).
On the other hand, I do think there’s room for a more general theory of harmful knowledge. While some of the groundwork has been laid, and we have a few ad-hoc guidelines in place, we don’t yet have a good consensus on epistemic safety, …
So has knowledge that is harmful beyond specific situations been demonstrated to exist, or are you referring to theorising?
Depends what bounds you want to put on it. Basilisk-like knowledge (what the Bostrom paper calls a neuropsychological hazard) affecting the human cognitive architecture has not as far as I know been demonstrated to exist. Several other context-dependent but still fairly general informational hazards (ideological, for example) do clearly exist, though, and many of them seem poorly understood.
The forbidden topic in particular seems to belong to an interesting family of reflective hazards that hasn’t gotten much attention at all, although for the sake of local norms I’d rather not devote too much attention to it here.
D’oh. Maybe I’m just tired and my brain is working less well than I’d hope, but I hadn’t noticed the link to the Bostrom paper there. I need to read through what people say to me more carefully.
I’ll give the paper a read-through tomorrow.
[edit] I scanned the paper, but the tiny section on neuropsychological hazards seemed to tend toward the low-level (photosensitive epilepsy, for example) rather than the Lovecraftian (as I might have expected, had I thought about it carefully, since I don’t place much credence in high-level ideas that could blow your mind that way).
Guess I was going for stuff I wish I hadn’t seen rather than stuff that it would have actually been better not to know.
OK, I did a search on RationalWiki and found this: http://rationalwiki.org/wiki/LessWrong#The_ugly which strikes me as odd.