Tarot cards are great, as they allow you to bypass some of the biases you’d have if you were to talk with yourself about the subject directly.
The self-protection instinct in circles like this is exaggerated. If something helps or works, why does it also have to be objectively correct? By placing too many limitations, you lock yourself out of advantages.
We don’t need to subscribe entirely to anything; it’s fine to cherry-pick advantages from all facets of life, and to overload yourself with multiple rulesets so that you can switch to whichever one best fits the context at hand.
Much of the “bullshit” is, as you say, true from a subjective perspective. They may “lie” to you because mystifying the process makes the placebo stronger, and because they enjoy life more when there’s a sense of mystery in it.
In this text, I feel like there’s a tendency to evaluate external things objectively as good or bad (or correct or wrong), or a felt responsibility to do so. But this very tendency is what makes judgement difficult. If you limit yourself to what you can rationalize to others, then you don’t trust yourself enough. And if somebody can’t differentiate between valuable insights and bullshit, then they likely won’t recognize this post of yours as being more correct than the words of the next guru they meet.
I guess what I don’t like is how ever-larger portions of online content consist of disclaimers, ways of abdicating responsibility, warnings, and explanations aimed at people who are likely to be skeptical of you.
LW posts (this one included) tend to contain more warnings about what to avoid than recommendations of what to do, and rational people already limit themselves plenty with rules as it is, which is why they could benefit from these spiritual practices in the first place. Nonetheless, I don’t feel much freer to share my thoughts on LW than I do in Christian communities. Both immune systems will filter me out if I don’t use enough buzzwords, or if I don’t apologize and seem humble enough.
That all said, I do agree with basically everything you’ve written!
Hard to tell whether my “keeping at a distance” is a helpful contingency or a lingering baseless aversion. Maybe a bit of both. I also might have exaggerated a bit in order to signal group alignment, with the disclaimers serving as a kind of honey to help the pill go down.
Thanks for your reflections.
As in “a distance from irrationality”? Many rationalists go for general correctness, avoiding overfitting to specifics, but I think this just means that they will never fit any specific context perfectly well. With compartmentalization, or some sort of try-catch around hippie practices, it’s possible to have your cake and eat it too: one can hold more than one model of reality, run experiments every now and then, and revert to the main branch with new knowledge once the experiment is over.
I think you’ve signaled group alignment, but I won’t deny that it feels necessary. My problem is with this necessity, or more precisely with the underlying collective mentality that causes it.
Some people reject religion on the grounds that the Earth is more than 6,000 years old, but this is a poor critique of religion, since its main benefits lie elsewhere (defense against nihilism and the fear of death, as well as shared values and practices which guard against common pitfalls of human nature). Any proper criticism of religion ought to operate on a higher level than “Fossil records!”.
But if you ask me, highly intelligent people are just as naive in their dismissal of spiritual practices. It means little that the explanation is bullshit when the people doing these practices experience improved mental health as a result.
Objectively speaking, if pure rationalism were the way to go, Darwinism would have selected for it a little harder.
Forgive me for ranting a bit!
I mostly agree, especially re shifting ontologies and the try-catch metaphor.
I agree religion provides meaning for many, but I don’t believe it’s necessary to combat nihilism. I don’t know if you intended to convey this, but in case anyone is interested, I can highly recommend the work of David Chapman, especially “Meaningness”. It has helped me reorient with regard to nihilism.
Also, our current context is very different from the one we evolved in—Darwinian selection occurred in a different context and is (for a bunch of other reasons) not a good indicator of how to live a good life.
I do agree with your other points and like the direction you are pointing at—pragmatic metaphysics is one of my recent interests that has yet to make an appearance in my writing.
It’s not necessary to combat nihilism, I agree. It was just an example of a common shallow argument, which is often said with confidence despite correlating negatively with competence on the subject.
I personally think that meaninglessness is psychological rather than philosophical, and that it reveals a lack of engagement. In other words, you can feel like your life is meaningful independent of your belief about the objective meaning of life.
I agree that the context is different, but if you ask me, the psychological knowledge on LW is lacking. Highly intelligent people become more logical, which almost always results in them identifying with their own intelligence and forgetting that they’re animals. They neglect their needs, feeling that they’re above them, or that they’re too intelligent to have irrational needs. The result is poor mental health in intelligent people, and the history of philosophy is basically a series of failed attempts at solving psychological problems through math and logic.
It takes very little to make a human happy, and fighting with oneself is certainly not the best way. Killing desires, killing one’s ego, destroying one’s biases, killing one’s emotions: these are all religious, philosophical, and rational methods of becoming a “more correct” person. Doesn’t this border on self-hatred and self-mutilation? I could understand it as self-sacrifice for the sake of scientific advancement, but people often try to solve this problem rationally, not realizing that excess rationality is the cause.
What if the idea that life is a problem to be solved is itself a symptom of bad mental health? A perfectionist believes the solution to their problem is becoming more perfect, rather than getting rid of the perfectionism. In the same way, excess rationalism would be a symptom rather than a solution, effectively trapping intelligent people in a life of unhappiness.