In practice, I’ve found that it’s possible to keep a lot of information secret without the need to either lie, or do lots of extra Glomarizing to avoid the act of Glomarizing giving too much away. Rather than have a stated, deterministic solution to each possible question or problem, I do what seems practical given the situation and who is asking.
One tactic I’ve found necessary is to just not talk about entire topics, or not write entire posts, that would put me in a position where I’d be backed into a corner and Glomarizing would be too suspicious to pull off without giving the game away, and of course not talking about which posts/topics those are. The alternative, to do sufficient Glomarizing ‘in the open,’ would cut off a lot more discussion/information on net and also be more socially costly.
In general, there’s a temptation to do things that are game-theoretically robust—where if they could see your source code and decision algorithm, you’d still be all right, or at least do as well as possible given the circumstances. This is of course hugely important to various scenarios important for AI, where you actually do face such circumstances. But in reality, it’s usually right to do things that are hugely exploitable if they could see what you’re up to—e.g. to not Glomarize ‘enough’ even though that opens you up to problems.
(If you feel this is a compromising question, feel free to not answer)
What are the costs that you’ve seen come with the “avoid an entire topic” approach? For myself, I can imagine some topics that contain sensitive information, but where avoiding the topic entirely would feel incredibly constraining. I’m wondering if you don’t experience many costs, or if you feel that whatever costs you incur are just the price of keeping information secret.
Thanks for explicitly giving me the out not to answer, I think that’s ‘doing it right’ here.
Not being able to talk about things really sucks! Especially because the things you’re actually thinking about a lot, and are the most interesting to you, are more likely to include information you can’t share, for various reasons.
On the flip side, there are also topics one can’t talk about because of worry that it would expose information about one’s opinions rather than secret facts. This can be annoying, but it’s also a good way to avoid things that you should, for plenty of other reasons, know better than to waste one’s time on!
As someone who uses this strategy as their default, it’s really hard. I avoid talking IRL about rationality and being trans. The latter is pretty easy since it’s not really a big part of who I am. But avoiding any hint of rationality and maintaining a mask of normality is exhausting. It’s not just about not answering questions, it’s about not creating situations that lead to the questions. It is said in the Sequences that if you tell one lie, the truth is ever after your enemy. That’s an exaggeration, but not by much. AI, aging, genetics, all sorts of things are dangerous topics due to their proximity to my weirdness. I have to model the reactions to everything I say one or two steps ahead, and if I get it wrong I have to evade or misdirect. This has gotten a lot harder since I started studying rationality and had my head stuffed full of exciting concepts that are difficult to explain and sparkly enough to be difficult to think past.
It should be obvious from this that I don’t practice honesty in general, but I usually answer a direct question with honesty to mitigate the costs somewhat.
Less visible costs are that I’ll never meet a rationalist in real life (barring intentional meetups). I get to practice the virtue of argument a lot less… although being cut off from people has some serious advantages as well. There are probably other costs, but what’s the alternative? Not everyone can be Yudkowsky, and I just want to live my life in peace.