But the fact that you feel compelled to say that says something worrying about the state of our Society, right? It should really just go without saying—to anyone who actually thinks about the matter for a minute—that when someone on a no-barriers-to-entry free-to-sign-up internet forum asks for examples of unpopular opinions, then someone is going to post a terrible opinion that most other commenters will strongly disagree with (because it’s terrible). If, empirically, it doesn’t go without saying, that would seem to suggest that people feel the need to make the forum as a whole accountable to mob punishment mechanisms that are less discerning than anyone who actually thinks about the matter for a minute. But I continue to worry that that level of ambient social pressure is really bad for our collective epistemology, even if the particular opinion that we feel obligated to condemn in some particular case is, in fact, worthy of being condemned.
Like, without defending the text of the grandparent (Anderson pretty obviously has a normative agenda to push; my earlier comment was probably too charitable), the same sorts of general thinking skills that we need to solve AI alignment should also be able to cope with empirical hypotheses of the form, “These-and-such psychological sex differences in humans (with effect size Cohen’s d equaling blah) have such-and-these sociological consequences.”
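(For concreteness on the “effect size Cohen’s d equalling blah” part: Cohen’s d is just the difference in group means divided by the pooled standard deviation. A minimal sketch in Python, using only the standard library—the function name and the toy samples are mine, not from any comment above:)

```python
from statistics import mean, variance

def cohens_d(xs, ys):
    """Cohen's d: standardized difference between two sample means,
    scaled by the pooled standard deviation."""
    nx, ny = len(xs), len(ys)
    # Pooled variance weights each sample's unbiased variance by its
    # degrees of freedom (n - 1).
    pooled_var = ((nx - 1) * variance(xs) + (ny - 1) * variance(ys)) / (nx + ny - 2)
    return (mean(xs) - mean(ys)) / pooled_var ** 0.5

# Two toy samples whose means differ by exactly one pooled SD:
print(cohens_d([1, 2, 3], [2, 3, 4]))  # -1.0
```

(So “d = 0.5” means the group means differ by half a standard deviation—which is the kind of quantity an empirical hypothesis of the above form would have to specify.)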
Probably that discussion shouldn’t take place on Less Wrong proper (too far off-topic), but if there is to be such a thing as an art of rationality, the smart serious version of the discussion—the version that refutes misogynistically-motivated idiocy while simultaneously explaining whatever relevant structure-in-the-world some misogynistic idiots are nevertheless capable of perceiving—needs to happen somewhere. If none of the smart serious people can do it because we’re terrified that the media (or Twitter, or /r/SneerClub) can’t tell the difference between us and Stuart Anderson, then we’re dead. I just don’t think that level of cowardice is compatible with the amount of intellectual flexibility that we need to save the world.
The immortal Scott Alexander wrote, “you can’t have a mind that questions the stars but never thinks to question the Bible.” Similarly, I don’t think you can have a mind that designs a recursively self-improving aligned superintelligence (!!) but has to rely on no-platforming tactics rather than calmly, carefully, and objectively describing in detail the specific ways in which the speaker’s cognitive algorithms are failing to maximize the probability they assign to the actual outcome.
If none of the smart serious people can do it because we’re terrified that the media (or Twitter, or /r/SneerClub) can’t tell the difference between us and Stuart Anderson, then we’re dead.
The cynical hypothesis is that the media (or Twitter, or /r/SneerClub) fundamentally do not care about the difference between us and Stuart Anderson, and even if they can tell the difference, it doesn’t matter.
But more importantly—
… if there is to be such a thing as an art of rationality, the smart serious version of the discussion … needs to happen somewhere.
Suppose you were asked to briefly describe what such a place (which would, by construction, not be Less Wrong) would be like—what would you say?
Invite-only private email list that publishes highlights to a pseudonymous blog with no comment section.
You might ask, why aren’t people already doing this? I think the answer is going to be some weighted combination of (a) they’re worthless cowards, and (b) the set of things you can’t say, and the distortionary effect of recursive lies, just aren’t that large, such that they don’t perceive the need to bother.
There are reasons I might be biased to put too much weight on (a). Sorry.
(c) unpopular ideas hurt each other by association, (d) it’s hard to find people who can be trusted to have good unpopular ideas but not bad unpopular ideas, (e) people are motivated by getting credit for their ideas, (f) people don’t seem good at group writing curation generally
Thanks. (e) is very important: that’s a large part of why my special-purpose pen name ended up being a mere “differential visibility” pseudonym (for a threat-model where the first page of my real-name Google results matters because of casual searches by future employers) rather than an Actually Secret pseudonym. (There are other threat models that demand more Actual Secrecy, but I’m not defending against those because I’m not that much of a worthless coward.)
I currently don’t have a problem with (d), but I agree that it’s probably true in general (and I’m just lucky to have such awesome friends).
I think people underestimate the extent to which (c) is a contingent self-fulfilling prophecy rather than a fixed fact of nature. You can read the implied social attack in (a) as an attempt to push against the current equilibrium.
Suppose you were asked to briefly describe what such a place (which would, by construction, not be Less Wrong) would be like—what would you say?
I’m a big fan of in-person conversation. I think it’s entirely possible to save the world without needing to be able to talk about anything you want online in a public forum.
As per my other comment—is it the “public” part that you feel is critical here, or the “online” part, or are they both separately necessary (and if so—are they together sufficient? … though this is a much trickier question, of course).
There’s a false dilemma there, though. “In-person conversation” and “online public forum” are surely not the only possibilities. At the very least, “private online forum” is another option, yes?
You don’t need to denounce someone who’s demonstrably wrong; you just point out how they’re wrong.
I think you’re misunderstanding the implications of the heresy dynamic. It’s true that people who want to maintain their good standing within the dominant ideology—in the Cathedral, we could say, since you seem to be a Moldbug fan—can’t honestly engage with the heretic’s claims. That doesn’t imply that the heretic’s claims are correct—they just have to be not so trivially wrong as to permit a demonstration of their wrongness that doesn’t require the work of intellectually honest engagement (which the pious cannot permit themselves).
If a Bad Man says that 2+2=5, then good people can demonstrate the arithmetic error without pounding the table and denouncing him as a Bad Man. If a Bad Man claims that P equals NP, then good people who want the Bad Man gone, but wouldn’t be caught dead actually checking the proof, are reduced to pounding the table—but that doesn’t mean the proof is correct! Reversed stupidity is not intelligence.
What exactly do people think is the endgame of denunciation?
Evading punishment of non-punishers. Good people who don’t shun Bad Men might fall under suspicion of being Bad Men themselves.
I had hoped that people would be more rational and less pissed off, but you win some, you lose some.
I know the feeling.
The evolutionary need for sexual dimorphism will disappear; evolution will take care of the rest.
Um. You may be underestimating the timescale on which evolution works? (The evolution of sexual dimorphism is even slower!)
I specifically said I offered no solution in that post.
That’s a start, but if you’re interested in writing advice, I would recommend trying a lot harder to signal that you really understand the is/ought distinction. (You’re doing badly enough at this that I’m not convinced you do.) You’ve been pointing to some real patterns, but when your top-line summary is “Women’s agency [...] is contrary to a society’s progress and stability” … that’s not going to play in Berkeley. (And for all of their/our other failings, a lot of people in Berkeley are very smart and have read the same blogs as you, and more—even if they’re strategic about when to show it.)
I’m a big fan of in-person conversation. I think it’s entirely possible to save the world without needing to be able to talk about anything you want online in a public forum.
I disagree.
(I mean, ‘possible’ is a weak word, many things are possible, but I think it’s the sort of massive handicap that I’m not sure how to get around.)