I don’t need another person to understand me fully. It’s perfectly fine that different people have different life philosophies. If you want to affect other people, pushing something on them is usually not very effective.
I agree that one should not pin one’s hopes on changing someone else’s life philosophy. The specific reason I’m interested in this is that there are people I want to talk to about what I’m thinking about now, but I realize that I can’t do that without talking about what I was thinking about three months ago, and I can’t talk about that without… But this gap is just going to increase unless I take deliberate steps to decrease it. (This is exactly what you describe later in your post.)
[edit] And, since this wasn’t as specific as it could have been: they don’t have to agree with my position; my first goal is for them to know my position well enough to make correct predictions about it. If they like it better, they’ll move on their own.
I’m also realistic about the timeframes involved. I think it’s been about a decade since we were fully philosophically compatible, and taking another decade to close that gap seems like it might be necessary.
Giving up the need to have the other person understand you fully opens up a lot of freedom. You can start to listen to the other person. If you see a door that you can open to help a person reach an insight, you can go for it.
I’m reading this as “haste makes waste; if you learn the other person’s philosophy, you get credibility for listening, and that knowledge lets you avoid the parts with the most resistance and target the parts that make the easiest jumping-off points for explaining your positions, or that are the most fertile places for your ideas to grow in. Once that’s established—and you’ll only know it’s established when you listen to them and hear that it’s taken root—then their philosophy will have shifted and there will be a new easiest spot.” Is that the main spirit of it or is there something I missed?
How easy it is for someone else to predict your actions depends on the way you make decisions.
If you make decisions by making a Fermi estimate and then applying Bayes’ theorem, a person who has no idea about Fermi estimates or Bayes’ theorem won’t be able to predict your actions. If you have integrated those concepts into your life so that you use them constantly, then even a person who has learned Bayes’ theorem in a university lecture is unlikely to be able to follow. For them it’s just something abstract that’s used for textbook problems, and they will have a really hard time predicting your actions, because they can’t model you in a way that includes Bayes’ theorem.
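For concreteness, here is a toy sketch of what such a decision procedure might look like; the scenario and every number in it are made up for illustration, not taken from the discussion above:

```python
# Toy sketch of a "Fermi estimate + Bayes' theorem" decision, with made-up numbers.
# Question: should I leave early for the airport?

# Fermi estimate: rough prior probability of heavy traffic on a weekday evening.
rush_hour_fraction = 0.3            # guess: ~1/3 of departures hit rush hour
prior_heavy = rush_hour_fraction

# Evidence: the traffic app shows the route in red.
# Likelihoods are also rough guesses.
p_red_given_heavy = 0.9             # the app usually flags real congestion
p_red_given_light = 0.2             # but sometimes flags minor slowdowns too

# Bayes' theorem: P(heavy | red) = P(red | heavy) * P(heavy) / P(red)
p_red = (p_red_given_heavy * prior_heavy
         + p_red_given_light * (1 - prior_heavy))
posterior_heavy = p_red_given_heavy * prior_heavy / p_red

print(f"P(heavy traffic | red route) ≈ {posterior_heavy:.2f}")  # ≈ 0.66

# Decision rule: leave 30 minutes early if the posterior is above 0.5.
leave_early = posterior_heavy > 0.5
```

Someone watching from outside only sees me suddenly deciding to leave early; without knowing that a rough estimate and an update like this happened in my head, the action looks arbitrary.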
If I make a decision based on emotions, my somatics teacher might be able to predict my actions better than I can myself, because she has a better perception of my emotional state than I do. If I make my decisions based on an abstract intellectual concept like Bayes’ theorem, then she can’t predict the outcomes. From her perspective I’m in my head, and she has no information besides that.
Outside of specific primitives, if you play a game two levels higher than the other person, you are unpredictable.
Is that the main spirit of it or is there something I missed?
It goes beyond that. If you show a person who believes that vaccines cause autism a news article that lists scientific arguments that vaccines don’t cause autism, you can strengthen their belief that vaccines cause autism.
Mormons who go on a mission and debate Mormonism with outsiders often become more committed Mormons, because they invest effort into defending their beliefs to outsiders.
We live in a world where a study found that people of higher intelligence are more likely to disbelieve in global warming.
In Go strategy there is the term aji keshi: a premature move that forces a response and erases potential you could have used later. Defending a belief works the same way; it makes the belief more rigid.
If I try to open a door today and do badly at it, that door will be fortified in a month and it will be even harder to get in.
If you successfully plant a little carrot and then pull strongly on it before it’s ready, you kill it.