(If you feel this is a compromising question, feel free to not answer)
What are the costs that you’ve seen come with the “avoid an entire topic” approach? For myself, I can imagine some topics that contain sensitive information, but that I’d also feel incredibly constrained not to talk about. I’m wondering if you don’t experience many costs, or if you feel that whatever costs you incur are just the price of keeping information secret.
Thanks for explicitly giving me the out not to answer, I think that’s ‘doing it right’ here.
Not being able to talk about things really sucks! Especially because the things you’re actually thinking about a lot, and are the most interesting to you, are more likely to include information you can’t share, for various reasons.
On the flip side, there are also topics one can’t talk about out of worry that doing so would expose information about one’s opinions rather than secret facts. This can be annoying, but it’s also a good way to avoid things that you should, for plenty of other reasons, know better than to waste your time on!
As someone who uses this strategy by default: it’s really hard. I avoid talking IRL about rationality and about being trans. The latter is pretty easy, since it’s not really a big part of who I am. But avoiding any hint of rationality and maintaining a mask of normality is exhausting. It’s not just not answering questions, it’s not creating the situations that lead to the questions. It is said in the Sequences that if you tell one lie, the truth is ever after your enemy. That’s an exaggeration, but not by much. AI, aging, genetics, all sorts of things are dangerous topics due to their proximity to my weirdness. I have to model the reactions to everything I say one or two steps ahead, and if I get it wrong I have to evade or misdirect. This has gotten a lot harder since I started studying rationality and had my head stuffed full of exciting concepts that are difficult to explain and sparkly enough to be difficult to think past.
It should be obvious from this that I don’t practice honesty in general, but I usually answer direct questions honestly to mitigate the costs somewhat.
A less visible cost is that I’ll never meet a rationalist in real life (barring intentional meetups). I get to practice the virtue of argument a lot less… although being cut off from people has some serious advantages as well. There are probably other costs, but what’s the alternative? Not everyone can be Yudkowsky, and I just want to live my life in peace.