Another issue with teaching it academically is that academic thought, like I already said, frames things in a mathematical and thus non-human way. And treating people like objects to be manipulated for certain goals (a common consequence of this way of thinking) is not only in bad taste, it also makes the game of life less enjoyable.
If you want something to be part of you, then you need to come up with it yourself; then it will be your own knowledge. Learning other people’s knowledge, however, feels to me like consuming something foreign.
Yes, the trick for that is to delete the piece of knowledge you learnt and ask the question: how could I have come up with this myself?
Of course, my defense of ancient wisdom so far has simply been to translate it into an academic language in which it makes sense. “Be like water” is the street-smarts, and “adaptability is a core component of growth/improvement/fitness” is the book-smarts. But the “street-smarts” version is easier to teach, and now that I think about it, that’s what the Bible was for.
That just sounds to me like “we need wisdom because people cannot think”. Yes, I would agree, considering that when you open Reddit, Twitter or any other platform you can find many biases being upvoted. I would agree that a memetic immune system is required for a person unaware of the various background literature needed to bootstrap rationality. I am not advocating for teaching anything; I have no plans of being an activist, nor the will to change society. But consider this: if you know enough rationality, you can easily get past all that.
I would agree on the latter part regarding good/evil. Unlike other rationalists, this is why I don’t have the will to change society. The internet has killed my societal moral compass for good/evil, or, however you’d like to put it, made me more egoistic. “Good” just carries a positive system 1 connotation for me; I am just emoting it, but I mostly focus on my life. Or, to be brutally honest about it: I don’t care about society as long as my interests are being fulfilled.
The actual truth value of beliefs has no psychological effects (proof: otherwise we could use beliefs to measure the state of reality).
Agreed, the map is not the territory; it feels the same to be wrong as it feels to be right.
It’s more likely for somebody to become rich making music if their goal is simply to make music and enjoy themselves, than if their goal is to become rich making music.
Yes, if someone isn’t passionate about such endeavours they may not have the will to sustain them. But if a person is totally apathetic to monetary concerns they’re not going to make it either. So a person may argue, on a meta level, that it’s more optimal to be passionate about a field, or to choose a field you’re passionate about and want to do better in, in order to overcome akrasia. And there might be some selection bias at play, where a person who’s good at something is likely to have a positive feedback loop with the subject.
But the “Something to protect” link you sent seems to argue for this as well?
Yes, exactly: truth is in highest service to other goals, if my phrasing of “highest instrumental value” wasn’t clear. But you don’t deliberately believe false things; that’s what rationality is all about. Truth is nice to have, but usefulness is everything.
Believing false things purposefully is impossible either way; you’re not anticipating them with high probability. That’s not how rationalist belief works. When you believe something, that’s how reality is to you; you look at the world through your beliefs.
How many great people’s autobiographies and life stories have you read?
Not many, but it would be unrepresentative to generalise from that.
But it’s ultimately a projection: a worldview does not reveal the world, but rather the person with the worldview.
Ethically yes, epistemically no. Reality doesn’t care. This is what society gets wrong: if I am disagreeing with your climate denial or climate catastrophism, I am not proposing what needs to be done. There is a divide between morals and epistemics.
“I define rationality as what’s correct, so rationality can never be wrong, because that would mean you weren’t being rational”
Yes, finally you get my point. We label those things rationality: the things which work. The virtue of empiricism. Rationality is about having cognitive algorithms which systematically have higher returns on whatever it is you want.
maps of the territory are inherently limited (and I can prove this)
When you experience something your brain forms various models of it, and you look at the world through your beliefs.
You’re optimizing for “Optimization power over reality / a more reliable map”, while I’m optimizing for “Biological health, psychological well-being and enjoyment of existence”. And they do not seem to have as much in common as rationalists believe.
That’s a misrepresentation of my position: I said truth is my highest instrumental value, not my highest terminal value. Besides, a good portion of hardcore rationalists tend to have something to protect, a humanistic cause, which they devote themselves to; that tends to be aligned with their terminal values however they see fit. Others may focus solely on their own interests, like health, life and wellbeing.
To reiterate: you only seek truth as much as it allows you to get what you want, but you don’t believe in falsities. That’s it.
But if rationality in the end worships reality and nature, that’s quite interesting, because that puts it in the same boat as Taoism and myself. Some people even put Nature=God.
Rationality doesn’t necessarily have nature as a terminal value. Rationality is a tool: the set of cognitive algorithms which work for whatever you want, with truth being the highest instrumental value, as you might have read in the “Something to protect” article.
Finally, if my goal is being a good programmer, then a million factors will matter, including my mood, how much I sleep, how much I enjoy programming, and so on. But somebody who naively optimizes for programming skill might practice at the cost of mood, sleep, and enjoyment, and thus ultimately end up with a mediocre result. So in this case, a heuristic like “take care of your health and try to enjoy your life” might not lose out to a rat-race mentality in performance. Meta-level knowledge might help here, but I still don’t think it’s enough. And the tendency to dismiss things which seem unlikely, illogical or silly is not as great a heuristic as one would think, perhaps because any beliefs which manage to stay alive despite being silly have something special about them.
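To make the single-metric worry concrete, here is a toy sketch (every number and functional form is hypothetical, chosen only for illustration): if overall skill depends on practice but heavy practice crowds out sleep and mood, then maximizing practice hours alone underperforms a balanced schedule.

```python
def outcome(practice_hours):
    # Hypothetical model: heavy practice eats into sleep and enjoyment.
    sleep = max(4.0, 10.0 - 0.5 * practice_hours)
    mood = max(0.1, 1.0 - 0.08 * practice_hours)
    recovery = min(sleep / 8.0, 1.0)
    # Diminishing returns on raw practice, scaled by recovery and mood.
    return (practice_hours ** 0.5) * recovery * mood

balanced = max(range(1, 15), key=outcome)  # best overall daily schedule
naive = 14                                 # "just practice more"
print(balanced, outcome(balanced), outcome(naive))
```

Under these made-up numbers the balanced schedule (4 hours) beats the naive maximum (14 hours) by a wide margin; the point is only that optimizing the proxy metric is not the same as optimizing the goal.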
None of that is incompatible with rationality; rather, rationality will help you get there. Heuristics like “take care of your health and try to enjoy life” seem more like vague plans to fulfill your complex set of values, which one may discover more about. Values are complex, and there are various posts you can find here which may help you model yourself better and reach reflective equilibrium, which is the best you can do either way, both epistemically and morally (the former of which is much more easily reached by focusing on getting better with respect to your values than by focusing solely on truth, as highlighted by the post, since truth is only instrumental).
But these ways of looking at the world are not factually wrong; they’re just perverted in a sense. I agree that schools are quite terrible in general.
how could I have come up with this myself?
That helps for learning facts, but one can teach the same things in many different ways. A math book from 80 years ago may be confusing now, even if the knowledge it covers is something that you know already, because the terms, notation and ideas are slightly different.
we need wisdom because people cannot think
In a way. But some people who have never learned psychology have great social skills, and some people who are excellent at psychology are poor socializers. Some people also dislike “nerdy” subjects, and it’s much more likely that they’d listen to a TED talk on body language than read a book on evolutionary psychology and non-verbal communication. Having an “easy version” of knowledge available which requires 20 IQ points less than the hard version seems like a good idea. Some of the wisest and most psychologically healthy people I have met have been non-intellectual and non-ideological, and even teenagers or young adults. Remember your “Things to unlearn from school” post? Some people may have less knowledge than the average person, and thus fewer errors, making them clear-sighted in a way that makes them seem well-read. Teaching these people philosophy could very well ruin their beautiful worldviews rather than improve on them.
if you know enough rationality you can easily get past all that.
I don’t think “rationality” is required. Somebody who has never heard about the concept of rationality, but who is highly intelligent and thinks things through for himself, will be alright (outside of existential issues and infohazards, which have killed or ruined a fair share of actual geniuses). But we’re both describing conditions which apply to less than 2% of the population, so at best we have to suffer from the errors of the 98%.
I’m not sure what you mean by “when you dissent when you have an overwhelming reason”. The article you linked to worded it “only when”, as if one should otherwise dissent more often, but it also warns against dissenting since it’s dangerous. By the way, I don’t like most rationalist communities very much, and one of the reasons is that they have a lot of snobs who will treat you badly if you disagree with them. The social mockery I’ve experienced is also quite strong, which is strange, since you’d expect intelligence to correlate with openness, and the high rate of autistic people to combat some of the conformity.
I also don’t like activism, and the only reason I care about the stupid ideas of the world is that all the errors are making life harder for me and the people I care about. Like I said, not being an egoist is impossible, and there’s no strong evidence that all egoism is bad, only that egoism can be bad. The same goes for money and power: I think they’re neutral, and potentially either good or bad. But being egoistic can make other people afraid of me if I don’t act like I don’t realize what I’m doing.
It’s more optimal to be passionate about a field
I think this is mostly correct. But optimization can kill passion (since you’re just following the meta and not your own desires). And common wisdom says “Follow your dreams” which is sort of naive and sort of valid at the same time.
Believing false things purposefully is impossible
I think believing something you think is false, intentionally, may be impossible. But false beliefs exist, so believing in false things is possible. For something where you’re between 10% and 90% sure, you can choose whether you want to believe it or not, using the following algorithm: say “X is true because” and then allow your brain to search through your memory for evidence. It will find some.
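A quick way to see why that algorithm “works” (hypothetical numbers, purely illustrative): if you update only on the memories a one-sided search retrieves, you can reach near-certainty in a claim that the full set of memories argues against.

```python
def posterior(prior, evidence, p_if_true=0.8, p_if_false=0.3):
    # Odds-form Bayes update over independent yes/no pieces of evidence.
    odds = prior / (1 - prior)
    for supports in evidence:
        if supports:
            odds *= p_if_true / p_if_false
        else:
            odds *= (1 - p_if_true) / (1 - p_if_false)
    return odds / (1 + odds)

# Suppose only 12 of 40 relevant memories support X (X looks false overall).
memories = [True] * 12 + [False] * 28

honest = posterior(0.5, memories)                       # weigh everything
one_sided = posterior(0.5, [m for m in memories if m])  # "X is true because..."
print(honest, one_sided)  # near-zero vs near-one
```

The evidence model here is invented, but the asymmetry is the point: the cherry-picked update lands near certainty while the honest one lands near zero.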
The articles you posted on beliefs are about the rules of linguistics (“belief in belief” is a valid string) and logic, but how belief works psychologically may be different. I agree that real beliefs are internalized (exist in system 1) to the point that they’re just part of how you anticipate reality. But some beliefs are situational and easy to consciously manipulate (example: self-esteem. You can improve or harm your own self-esteem in about 5 minutes if you try, since you just pick a perspective and a set of standards in which you appear to be doing well or badly). Self-esteem is subjective, but I don’t think the brain differentiates between subjective and objective things; it doesn’t even know the difference.
And it doesn’t seem like you value truth itself, but that you value the utility of some truths, and only because they help you towards something you value more?
Ethically yes, epistemically no
You may believe this because a worldview has to be formed through interactions with the territory, which means that a worldview cannot be totally unrelated to reality? You may also mean this: that if somebody has both knowledge and value judgements about life, then the knowledge is either true or false, while the value judgements are a function of the person. A happy person might say “Life is good” and a depressed person might say “Life is cruel”, and they might even know the same facts.
Online “black pills” are dangerous, because the truth value of the knowledge doesn’t imply that the negative worldview of the person sharing it is justified. Somebody reading the Yoga Vasistha might become depressed because he cannot refute it, but this is quite an advanced error in thinking, as you don’t need to refute it for its negative tone to be false.
Rationality is about having cognitive algorithms which have higher returns
But then it’s not about maximizing truth, virtue, or logic. If reality operates by different axioms than logic, then one should not be logical. The word “virtue” is overloaded, so people write as if the word is related to morality, but it’s really just about thinking in ways which make one more clear-sighted. So people who tell me to have “humility” are “correct” in that being open to changing my beliefs makes it easier for me to learn, which is rational, but they often act as if they’re better people than me (as if I’ve made an ethical/moral mistake in being stubborn or certain of myself). By truth, one means “reality” and not the concept “truth” as the result of a logical expression. This concept is overloaded too, so that it’s easy for people to manipulate a map with logical rules and then tell another person “You’re clearly not seeing the territory right”.
physics is more accurate than intuitive world models
Physics is our own constructed reality, which seems to act a lot like the actual reality. But I think an infinite number of physical theories could exist which predict reality with high accuracy. In other words, “there’s no one true map”. We reverse-engineer experiences into models, but one experience can create multiple models, and multiple models can predict the same experiences. One of the limitations is that “there’s no universal truth”, but this is not even a problem, as the universe is finite. But “universal” in mathematics is assumed to be truly universal, covering all things, and it’s precisely this which is not possible. But we don’t notice, and thus come up with the illusion of uniqueness. And it’s this illusion which creates conflict between people, because they disagree with each other about what the truth is, claiming that conflicting things cannot both be true. I dislike the consensus because it’s the consensus and not a consensus.
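A minimal sketch of the “multiple models can predict experiences” point (toy functions, not real physics): two models can agree on every observation made so far and still disagree about unobserved cases, so the data alone cannot single out one true map.

```python
def model_a(x):
    return x ** 2

def model_b(x, c=5.0):
    # Adds a term that vanishes exactly at the observed points,
    # so any value of c fits the data equally well.
    return x ** 2 + c * x * (x - 1) * (x - 2)

observed = [0, 1, 2]
assert all(model_a(x) == model_b(x) for x in observed)  # indistinguishable so far

print(model_a(3), model_b(3))  # 9 vs 39: they diverge off the observed data
```

This is the classic underdetermination setup: every observation collected so far is consistent with infinitely many models, and they only come apart on predictions not yet tested.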
A good portion of hardcore rationalists tend to have something to protect, a humanistic cause
My bad for misrepresenting your position. Though I don’t agree that many hardcore rationalists care for humanistic causes. I see them as placing rationality above humanity, and thus preferring robots, cyborgs, and AIs over humanity. They think they prefer an “improvement” of humanity, but this functionally means the destruction of humanity. If you remove negative emotions (or all emotions entirely; after all, these are the source of mistakes, right?), subjectivity, and flaws from humans, and align them with each other by giving them the same personality, or get rid of the ego (it’s also a source of errors and unhappiness), what you’re left with is not human. It’s at best a sentient robot. And this robot can achieve goals, but it cannot enjoy them. I just remembered seeing the quote “Rationality is winning”, and I’ll admit this idea sounds appealing. But a book I really like (EST: Playing the Game the New Way, by Carl Frederick) is precisely about winning, and its main point is this: you need to give up on being correct. The human brain wants to have its beliefs validated, that’s all. So you let other people be correct, and then you ask them for what you want, even if it’s completely unreasonable.
Rationality doesn’t necessarily have nature as a terminal value
I meant nature as its source (of evidence/truth/wisdom/knowledge). “Nature” meaning reality/the Dao/the laws of physics/the universe/GNON. I think most schools of thought draw their conclusions from reality itself. The only kind of worldview which seems disconnected from reality is religions which create ideals out of what’s lacking in life and make those out to be virtues and the will of God.
None of that is incompatible with rationality
What I dislike might not be rationality, but how people apply it, and psychological tendencies in the people who apply it. But upvotes and downvotes seem very biased in favor of consensus and verifiability, rather than simply being about getting what you want out of life. People also don’t seem to like being told accurate heuristics which seem immoral or irrational (in the colloquial sense that regular people use), even if they predict reality well. There’s also an implicit bias towards altruism, which cannot be derived from objective truth.
About my values: they already exist even if I’m not aware of them; they’re just unconscious until I make them conscious. But if system 1 functions well, then you don’t really need to train system 2 to function well, and it’s a pain to force system 2 rationality onto system 1 (your brain resists most attempts at self-modification). I like the topic of self-modification, but that line of study doesn’t come up on LW very often, which is strange to me. I still believe that the LW community downplays the importance of human nature and psychology. It may even undervalue system 1 knowledge (street smarts and personal experience) and overvalue system 2 knowledge (authority, book smarts, and reasoning).
I got into this conversation because I thought I would find something new here. As an egoist I am voluntarily leaving this conversation in disagreement because I have other things to do in life. Thank you for your time.
The short version is that I’m not sold on rationality, and while I haven’t read 100% of the sequences it’s also not like my understanding is 0%. I’d have read more if they weren’t so long. And while an intelligent person can come up with intelligent ways of thinking, I’m not sure this is reversible. I’m also mostly interested in tail-end knowledge. For some posts, I can guess the content by the title, which is boring. Finally, teaching people what not to do is really inefficient, since the space of possible mistakes is really big.
Your last link needs an s before the dot.
Anyway, I respect your decision, and I understand the purpose of this site a lot better now (though there’s still a small, misleading difference between the explanation of rationality and how users actually behave. Even the name of the website gave me the wrong impression).
Yes, intuitions can be wrong; welcome to reality. Besides, I think schools are bad at teaching things.
Sure, a person should be aware when they’re drifting from the crowd and not become a contrarian, since reversed stupidity is not intelligence; and even if you dissent only when you have an overwhelming reason for it, you’re going to have enough problems in your life.
I would disagree; physics is more accurate than intuitive world models. The act of guessing a hypothesis is reverse-engineering experience, so to speak: you get a causal model which is connected to you in the form of anticipations (this link is part of a sequence, so there’s a chance there’s a lot of background info).
Rationalists tend to have heavy respect for cognitive algorithms which systematically get us what we desire. They’re disturbed if there’s a violation in the process which gets us there.
Honestly, the majority of the points presented here are not new and have already been addressed in
https://www.lesswrong.com/rationality
or https://www.readthesequence.com/