Agree with others on this being a well-put together post, both clearly tying together several related concepts and pushing the surrounding conversation forward. I like ‘blatant cherrypicking is the best kind’ as an encapsulation of the concept.
Some high level thoughts here:
It’s better to share your cherry-picking algorithm at a high enough level that people can predict future things you might suddenly not be able to talk about in a few years. (i.e. I can list the things that I don’t currently talk about, but in a year a new issue might get politicized that I didn’t think to spell out in advance. And sharing “I don’t talk about X” right as X is getting politicized might be particularly socially hazardous. It’s also just costly, in time, to periodically update your taboos, and costly in other people’s time to constantly be checking up on it.)
Ideally, if it’s only OTHER people we’re worried about social harm from (i.e. non-aspiring-epistemic-rationalists), we still get to talk about the thing to build a fully integrated worldmodel. One property that a Citadel of Truth should have is actually keeping things private from the outside world. (This is a solvable logistical problem, although you do have to actually solve it. It might be good for LW to enable posts to be hidden from logged-out users, perhaps requiring some karma threshold to see taboo posts.)
The hardest of hard modes is “local politics”, where it’s not just that I’m worried about nebulous “outsiders” hurting me (or friends feeling pressure to disown me because they in turn face pressure from outsiders). Instead, the issue is politics inside the citadel. It seems like a quite desirable property to be able to talk freely about which local orgs and people deserve money and prestige – but I don’t currently know of robust game mechanics that will actually, reliably enable this in any environment where I don’t personally know and trust each person.
(Having multiple “inner citadels” of trust is sort of the de-facto way this is done currently, in my experience. Having clearer signposting on how to get trustworthy might be a good improvement. Notably, proclaiming “I only care about truth, not politics” is not sufficient for me to trust someone in this domain.)
Some concrete “cherry-picking-algorithm-sharings”
I’m happy, at this present moment, to note:
I generally avoid talking online about mainstream national political issues (gender and social justice issues in particular) unless they cross a particular threshold for “relevant to EA concerns”, or “personally important to me”.
(For example, climate change is politicized, but it’s pretty important to fit it into my overall catastrophic risk ontology, and worth paying the cost for. I don’t talk about it that often, partly because it still has a slightly-disproportionate cost*, and partly because it just turns out climate change isn’t that important after having thought about it in an EA context.)
It currently so happens that gender/social-justice issues are both “national politics” and also “local politics”, which makes them additionally hard to talk openly about. But this is a property of the particular year, and different issues might be more sensitive in a couple years. If you want to know which things I can’t say in a couple years, and I haven’t written another such comment in the meanwhile, you may need to do interpretive labor and careful observation. Which sucks, and I’m sorry.
*The climate change cost isn’t just in “people might ostracize me” (I’m not actually worried about that), but “random people are less likely to be thinking clearly, and the conversation quality is often going to be worse”.
There’s an interesting thing where abortion becomes a much more interesting topic to me now that I have more nuanced views of moral uncertainty and trade. But this is a topic that I think makes most sense to discuss “within the citadel” (where “within the citadel” means “I’m reasonably confident randos from the culture war won’t show up, and/or start making life difficult for me or causes I care about”). Especially because most of the value isn’t related to abortion itself, but to integrating a worldview that includes abortion as well as digital uploads and low-grade simulations.
Local Politics
Local politics (i.e. facts pertinent to “who gets money/prestige in the rationalsphere”) has the double-edged property of “being much more important to talk about” and “being harder and anti-inductive to talk about”. This post has provided a good reminder for me to figure out some thoughts here and write them up. I think they’re beyond the scope of this comment though.
Final note:
None of these are things that I don’t talk about, period; they are just things that are disproportionately costly to talk about, so I don’t talk about them freely without putting some thought into it and making sure it’s worth it.
Update: in the past year, since writing this comment, I periodically thought ‘man, I should probably post some somewhat controversial opinions, to get into the habit of actually having the backbone to do that sometimes.’
And… well, I haven’t gotten around to it. And I think there is at least a little bit of “Ray is just being cowardly here”, but, also… it just doesn’t feel super worth it. It’s effortful to write up stuff in the first place, and I have a ton of stuff I want to write that isn’t controversial which I think is really important, and meanwhile I don’t think on the object level that any given controversial opinion was really that valuable to express except for building up my social resilience backbone.
I think “build up social resilience” is actually pretty important, enough that I lean towards “yes actually feel bad about this even though any given controversial opinion doesn’t feel very important to express”. But, whelp, here we are.