Is there value in seeking out and confronting these limits,
Yes.
or should we exercise caution in our pursuit of knowledge?
Yes.
. . . to be less flippant: I think there’s an awkward kind of balance to be struck around the facts that
A) Most ideas which feel like they ‘should’ be dangerous aren’t[1].
B) “This information is dangerous” is a tell for would-be tyrants (and/or people just making kinda bad decisions out of intellectual laziness and fear of awkwardness).
but C) Basilisks aren’t not real, and people who grok A) and B) then have to work around the temptation to round it off to “knowledge isn’t dangerous, ever, under any circumstance” or at least “we should all pretend super hard that knowledge can’t be dangerous”.
D) Some information—“here’s a step-by-step guide to engineering the next pandemic!”—is legitimately bad to have spread around even if it doesn’t harm the individual who knows it. (LWers distinguish between harmful-to-holder vs harmful-to-society with “infohazard” vs “exfohazard”.)
and E) It’s super difficult to predict what ideas will end up being a random person’s kryptonite. (Learning about factory farming as a child was not good for my mental health.)
I shouldn’t be trusted with language right now.
I might be reading too much into this, but it sounds like you’re going through some stuff right now. The sensible/responsible/socially-scripted thing to say is “you should get some professional counseling about this”. The thing I actually want to say is “you should post about whatever’s bothering you on the appropriate 4chan board, being on that site is implicit consent for exposure to potential basilisks, I guarantee they’ve seen worse and weirder”. On reflection I tentatively endorse both of these suggestions, though I recognize they both have drawbacks.
[1] For what it’s worth, I’d bet large sums at long odds that whatever you’re currently thinking about falls into this category.