It’s great that you have that satnav. I worry about people like me. I worry about being incapable of leaving those thoughts alone until I’ve pulled the thread enough to be sure I should ignore it. In other words, if I think there’s a chance something like that is true, I do want to trust the satnav, but I also want to be sure my “big if true” discovery genuinely isn’t true.
Of course, a good inoculation against this has been reading some intense blogs by people who’ve adopted alternative decision theories that lead them down paths which are really scary to watch.
I worry “there but for the grace of chance go I.” But that’s not quite right, and being able to read that content and not go off the deep end myself is evidence that maybe my satnav is functioning just fine after all.
I suspect I’m talking about the exact same class of infohazard as mentioned here. I think I know what’s being veiled and have looked it in the eye.
Thanks for your excellent input! It’s not really the potential accuracy of such dark philosophies that I’m worried about here (though that is also an area of some concern, of course, since I am human and do have those anxieties on occasion), but rather how easy it seems to be to fall prey to and subsequently act on those infohazards for a certain subclass of extremely intelligent people. We’ve sadly had multiple cases in this community of smart people succumbing to thought-patterns which arguably (probably?) led to real-world deaths, but as far as I can tell, the damage has mostly been contained to individuals or small groups of people so far. The same cannot be said of some religious groups and cults, who have a history of falling prey to such ideologies (“everyone in outgroup x deserves death,” is a popular one). How concerned should we be about, say, philosophical infohazards leading to x-risk level conclusions [example removed]? I suspect natural human satnav/moral intuition leads to very few people being convinced by such arguments, but due to the tendency of people in rationalist (and religious!) spaces to deliberately rethink their intuition, there seems to be a higher risk in those subgroups for perverse eschatological ideologies. Is that risk high enough that active preventative measures should be taken, or is this concern itself of the 1+1=3, wrong-side-of-the-abyss type?