Random idea: Hard Truths Ritual:
Get a campfire or something and a notepad and pencil. Write down on the pad something you think is probably true, and important, but which you wouldn’t say in public due to fear of how others would react. Then tear off that piece of paper and toss it in the fire. Repeat this process as many times as you can for five minutes; this is a brainstorming session, so your metric for success is how many diverse ideas you have multiplied by their average quality.
Next, repeat the above except instead of “you wouldn’t say in public...” it’s “you wouldn’t say to anyone.”
Next, repeat the above except instead of “you wouldn’t say...” it’s “you are afraid even to think/admit to yourself.”
Next, repeat the above except instead of “you think is probably true and important” it’s “you think is important, and probably not true, but maaaaybe true and if it was true it would be hard for you to admit to yourself.”
I’ve never done this entire ritual but I plan to try someday unless someone dissuades me. (I’m interested to hear suggestions on how to improve the idea. Maybe writing things down is worse than just thinking about them in your head, for example. Or maybe there should be some step at the end that makes you feel better since the default outcome might be that you feel like shit.) I’ve done mini-versions in my head sometimes while daydreaming and found it helpful.
General theory of why this is helpful:
Humans deceive themselves all the time. We deceive ourselves so as to be better at deceiving others. Many of our beliefs are there because they make us look good or feel good.
However, if you want to see the world clearly and believe true things (which isn't for everyone, but may be for you; think about it, because given what I said above it's not a trivial choice), then maybe this ritual can help you do that, because it puts your brain in a state where it is psychologically safe to admit those hard truths.
Interesting.
The main reasons why I’d see something potentially falling into category 3+ (maybe 2 also) are either a) threat models where I am observed far more than otherwise expected or b) threat models where cognitohazards exist.
...which for a) leads to “write something on a piece of paper and throw it in the fire” also being insecure, and for b) leads to “thinking of it is a bad idea regardless of what you do after”.
It sounds like you are saying you are unusually honest with yourself, much more than most humans. Yes?
Good point about cognitohazards. I’d say: Beware self-fulfilling prophecies.
I think you are underestimating how much I think falls into these categories. I suspect (although I do not know) that much of what you would call being dishonest to oneself I would categorize into a) or b).
(General PSA: although choosing a career that encourages you to develop your natural tendencies can be a good thing, it also has downsides. Being someone who is on the less trusting side of things at the best of times and works in embedded hardware with an eye toward security… I am rather acutely aware of how[1] much[2] information[3] leakage[4] there is from e.g. the phone in your pocket. Typical English writing speed is ~13 WPM[5]. English text has ~9.83 bits of entropy / word[6]. That’s only, what, 2.1 bits / second? That’s tiny[7][8].)
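The back-of-envelope arithmetic above can be checked in a couple of lines (a minimal sketch, using the rough figures quoted in the comment):

```python
# Back-of-envelope: information rate of handwritten English,
# using the approximate figures cited above.
wpm = 13              # typical handwriting speed, words per minute
bits_per_word = 9.83  # rough entropy estimate for English text

bits_per_second = wpm * bits_per_word / 60
print(f"{bits_per_second:.2f} bits/s")  # prints "2.13 bits/s"
```

So the "2.1 bits / second" figure checks out: a covert channel barely wider than a drinking straw could keep up with everything you write by hand.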
(I don’t tend to like the label, mainly because of the connotations, but the best description might be ‘functionally paranoid’. I’m the sort of person who reads the IT policy at work, notes that it allows searches of personal devices, and then never brings a data storage device of any sort to work as a result.)
Could you please elaborate?
[1] https://eprint.iacr.org/2021/1064.pdf ("whoops, brightness is a side-channel that can recover audio because power supplies aren't perfect". Not typically directly applicable to a phone, but still interesting.)
[2] https://github.com/ggerganov/kbd-audio/discussions/31 ("whoops, you can deduce keystrokes from audio recordings" (which probably means you can do the same with written text...))
[3] https://www.researchgate.net/publication/346954384_A_Taxonomy_of_WiFi_Sensing_CSI_vs_Passive_WiFi_Radar ("whoops, wifi can be used as a passive radar")
[4] https://www.hindawi.com/journals/scn/2017/7576307/ ("whoops, accelerometer data can often be used to determine location, because people don't travel along random routes")
[5] https://en.wikipedia.org/wiki/Words_per_minute#Handwriting (Strictly, that's copying speed, but close enough for a WAG.)
[6] https://www.sciencedirect.com/science/article/pii/S0019995864903262 (May differ for this particular sort of sentence versus typical English, but again, a WAG.)
[7] Or rather: modern data storage is huge, and modern data transmission is quick. It doesn't take much correlation at all to inadvertently leak a bit or two per second when you're transmitting millions (or billions) of bits per second. (And that's assuming P(malice)=0...)
[8] This sort of calculation is also why I don't put much stock in encryption schemes with longer-lived keys actually being secure. If you have a 512-bit key that you keep around for a month, you only need to leak a bit an hour to leak the entire key... and data storage is cheap enough that just because someone didn't do the work today of figuring out how to recover the info doesn't mean they can't pull it up a year or a decade from now, when they have better or more automated tooling. There's a side question as to whether anyone cares enough to do so... but another side question of whether that question matters if the collection and tooling are automated.
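The long-lived-key arithmetic in footnote [8] can be sketched the same way (assuming a 30-day month and the 2 bits/s leak rate estimated earlier):

```python
# Sketch of the long-lived-key calculation: how slowly can a
# side channel leak and still exfiltrate the whole key before
# it is rotated?
key_bits = 512
key_lifetime_hours = 30 * 24  # key kept around for ~a month

required_bits_per_hour = key_bits / key_lifetime_hours
print(f"{required_bits_per_hour:.2f} bits/hour")  # prints "0.71 bits/hour"

# For scale: leaking 2 bits/s over a 1 Mbit/s link is a
# correlation of only two parts per million of the traffic.
leak_fraction = 2 / 1_000_000
print(f"{leak_fraction:.0e}")  # prints "2e-06"
```

Under a bit an hour is well below the handwriting-entropy channel estimated above, which is the point: the required leak rate is tiny compared to the channels that plausibly exist.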
Ah, I guess I was assuming that things in (a) are not in (b), things in (b) are not in (c), etc., although as written pretty much anything in a later category would also be in all the earlier categories. Things you aren't willing to say to anyone, you aren't willing to say in public. Etc.
Example self-fulfilling prophecy: "I'm too depressed to be useful; I should just withdraw so as not to be a burden on anybody." For some people it's straightforwardly false, for others it's straightforwardly true, but for some it's true iff they think it is.