But these ways of looking at the world are not factually wrong; they're just perverted, in a sense. I agree that schools are quite terrible in general.
how could I have come up with this myself?
That helps for learning facts, but the same things can be taught in many different ways. A math book from 80 years ago may be confusing now, even if it covers knowledge you already have, because the terms, notation, and ideas are slightly different.
we need wisdom because people cannot think
In a way. But some people who have never studied psychology have great social skills, and some people who are excellent at psychology are poor socializers. Some people also dislike “nerdy” subjects, and it’s much more likely that they’d listen to a TED talk on body language than read a book on evolutionary psychology and non-verbal communication. Having an “easy version” of knowledge available, one which requires 20 fewer IQ points than the hard version, seems like a good idea. Some of the wisest and most psychologically healthy people I have met have been non-intellectual and non-ideological, and even teenagers or young adults. Remember your “Things to unlearn from school” post? Some people may have less knowledge than the average person, and thus fewer errors, making them clear-sighted in a way that makes them seem well-read. Teaching these people philosophy could very well ruin their beautiful worldviews rather than improve on them.
if you know enough rationality you can easily get past all that.
I don’t think “rationality” is required. Somebody who has never heard of the concept of rationality, but who is highly intelligent and thinks things through for himself, will be all right (outside of existential issues and infohazards, which have killed or ruined a fair share of actual geniuses). But we’re both describing conditions which apply to less than 2% of the population, so at best we have to suffer the errors of the other 98%.
I’m not sure what you mean by “when you dissent when you have an overwhelming reason”. The article you linked to worded it “only when”, as if one should dissent more often, but it also warns against dissenting since it’s dangerous. By the way, I don’t like most rationalist communities very much, and one of the reasons is that they have a lot of snobs who will treat you badly if you disagree with them. The social mockery I’ve experienced is also quite strong, which is strange, since you’d expect intelligence to correlate with openness, and the high rate of autistic members to counteract some of the conformity.
I also don’t like activism, and the only reason I care about the stupid ideas of the world is that all the errors are making life harder for me and the people I care about. Like I said, not being an egoist is impossible, and there’s no strong evidence that all egoism is bad, only that egoism can be bad. The same goes for money and power: I think they’re neutral, and both potentially good or bad. But being egoistic can make other people afraid of me if I don’t act like I don’t realize what I’m doing.
It’s more optimal to be passionate about a field
I think this is mostly correct. But optimization can kill passion (since you’re just following the meta and not your own desires). And common wisdom says “follow your dreams”, which is sort of naive and sort of valid at the same time.
Believing false things purposefully is impossible
I think believing something you think is false, intentionally, may be impossible. But false beliefs exist, so believing in false things is possible. For something where you’re between 10% and 90% sure, you can choose whether you want to believe in it or not, and then use the following algorithm: say “X is true because...” and then allow your brain to search through your memory for evidence. It will find some.
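The “algorithm” above can be sketched as a biased retrieval process. This is my own toy illustration (the memories and labels are invented for the example, not taken from the conversation): a search that only surfaces memories matching a pre-chosen conclusion will “find evidence” for whichever conclusion you start from.

```python
# Toy model of motivated reasoning: a memory search that only retrieves
# items supporting a conclusion chosen in advance.

def biased_recall(memories, conclusion):
    """Return only the memories that support the chosen conclusion."""
    return [m for m in memories if m["supports"] == conclusion]

memories = [
    {"event": "aced the exam",       "supports": "I am competent"},
    {"event": "missed a deadline",   "supports": "I am incompetent"},
    {"event": "praised by a client", "supports": "I am competent"},
]

# Whichever conclusion you begin with, the search "finds" evidence for it.
for claim in ("I am competent", "I am incompetent"):
    print(claim, "->", [m["event"] for m in biased_recall(memories, claim)])
```

The point of the sketch is that the search procedure, not the underlying evidence, determines what gets believed: the same memory store supports both conclusions, depending only on the query.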
The articles you posted on beliefs are about the rules of linguistics (“belief in belief” is a valid string) and logic, but how belief works psychologically may be different. I agree that real beliefs are internalized (they exist in system 1) to the point that they’re just part of how you anticipate reality. But some beliefs are situational and easy to consciously manipulate. Self-esteem is an example: you can improve or harm your own self-esteem in about five minutes if you try, since you just pick a perspective and a set of standards by which you appear to be doing well or badly. Self-esteem is subjective, but I don’t think the brain differentiates between subjective and objective things; it doesn’t even know the difference.
And it doesn’t seem like you value truth itself, but that you value the utility of some truths, and only because they help you towards something you value more?
Ethically yes, epistemically no
You may believe this because a worldview has to be formed through interactions with the territory, which means that a worldview cannot be totally unrelated to reality? You may also mean this: that if somebody has both knowledge and value judgements about life, then the knowledge is either true or false, while the value judgements are a function of the person. A happy person might say “life is good” and a depressed person might say “life is cruel”, and they might even know the same facts.
Online “black pills” are dangerous, because the truth value of the knowledge doesn’t imply that the negative worldview of the person sharing it is justified. Somebody reading the Vasistha Yoga might become depressed because he cannot refute it, but this is quite an advanced error in thinking, as you don’t need to refute it for its negative tone to be false.
Rationality is about having cognitive algorithms which have higher returns
But then it’s not about maximizing truth, virtue, or logic. If reality operates by different axioms than logic does, then one should not be logical. The word “virtue” is overloaded, so people write as if the word is related to morality, but here it’s really just about thinking in ways which make one more clear-sighted. So people who tell me to have “humility” are “correct” in that being open to changing my beliefs makes it easier for me to learn, which is rational, but they often act as if they’re better people than me (as if I’ve made an ethical or moral mistake in being stubborn or certain of myself). By “truth”, one means reality, and not the concept of truth as the result of a logical expression. This concept is overloaded too, so it’s easy for people to manipulate a map with logical rules and then tell another person, “You’re clearly not seeing the territory right”.
physics is more accurate than intuitive world models
Physics is our own constructed reality, which seems to act a lot like the actual reality. But I think an infinite number of alternative physics could exist which predict reality with high accuracy. In other words, “there’s no one true map”. We reverse-engineer experiences into models, but one experience can produce multiple models, and multiple models can predict the same experiences. One of the limitations is “there’s no universal truth”, but this is not even a problem, as the universe is finite. But “universal” in mathematics is assumed to be truly universal, covering all things, and it’s precisely this which is not possible. But we don’t notice, and thus come up with the illusion of uniqueness. And it’s this illusion which creates conflict between people, because they disagree with each other about what the truth is, claiming that conflicting things cannot both be true. I dislike the consensus because it’s “the” consensus and not “a” consensus.
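The claim that multiple models can predict the same experiences has a standard minimal demonstration, underdetermination by finite data. The example below is my own (the specific functions and data points are invented for illustration): two models that agree on every observation collected so far, yet disagree about everything unobserved.

```python
# Two different "maps" that agree on all observations made so far,
# but diverge on territory that has not been measured yet.

observations = [0.0, 1.0, 2.0]  # the x-values actually measured

def model_a(x):
    return x**2

def model_b(x):
    # The added term is zero exactly at x = 0, 1, and 2,
    # so this model matches model_a on every observation.
    return x**2 + x * (x - 1) * (x - 2)

# Both models fit the collected data perfectly...
assert all(model_a(x) == model_b(x) for x in observations)

# ...but they make different predictions about unmeasured points.
print(model_a(3.0), model_b(3.0))  # 9.0 vs 15.0
```

No finite set of observations singles out one of these maps as “the” true one; only new measurements (here, anything outside {0, 1, 2}) can discriminate between them.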
A good portion of hardcore rationalists tend to have something to protect, a humanistic cause
My bad for misrepresenting your position. Though I don’t agree that many hardcore rationalists care for humanistic causes. I see them as placing rationality above humanity, and thus preferring robots, cyborgs, and AIs over humans. They think they prefer an “improvement” of humanity, but this functionally means the destruction of humanity. If you remove negative emotions (or all emotions entirely; after all, these are the source of mistakes, right?), subjectivity, and flaws from humans, and align them with each other by giving them the same personality, or get rid of the ego (it’s also a source of errors and unhappiness), what you’re left with is not human. It’s at best a sentient robot. And this robot can achieve goals, but it cannot enjoy them. I just remembered seeing the quote “rationality is winning”, and I’ll admit this idea sounds appealing. But a book I really like (EST: Playing the Game the New Way, by Carl Frederick) is precisely about winning, and its main point is this: you need to give up on being correct. The human brain wants to have its beliefs validated, that’s all. So you let other people be correct, and then you ask them for what you want, even if it’s completely unreasonable.
Rationality doesn’t necessarily have nature as a terminal value
I meant nature as its source (of evidence/truth/wisdom/knowledge), “nature” meaning reality/the Dao/the laws of physics/the universe/GNON. I think most schools of thought draw their conclusions from reality itself. The only kind of worldview which seems disconnected from reality is religions which create ideals out of what’s lacking in life and make those out to be virtues and the will of God.
None of that is incompatible with rationality
What I dislike might not be rationality itself, but how people apply it, and the psychological tendencies of the people who apply it. But upvotes and downvotes seem very biased in favor of consensus and verifiability, rather than simply being about getting what you want out of life. People also don’t seem to like being told accurate heuristics which seem immoral or irrational (in the colloquial sense that regular people use), even if those heuristics predict reality well. There’s also an implicit bias towards altruism which cannot be derived from objective truth.
As for my values, they already exist even if I’m not aware of them; they’re just unconscious until I make them conscious. But if system 1 functions well, then you don’t really need to train system 2 to function well, and it’s a pain to force system 2 rationality onto system 1 (your brain resists most attempts at self-modification). I like the topic of self-modification, but that line of study doesn’t come up on LW very often, which is strange to me. I still believe that the LW community downplays the importance of human nature and psychology. It may even undervalue system 1 knowledge (street smarts and personal experience) and overvalue system 2 knowledge (authority, book smarts, and reasoning).
Honestly, the majority of the points presented here are not new and have already been addressed in
https://www.lesswrong.com/rationality
or https://www.readthesequence.com/
I got into this conversation because I thought I would find something new here. As an egoist, I am voluntarily leaving this conversation in disagreement, because I have other things to do in life. Thank you for your time.
The short version is that I’m not sold on rationality, and while I haven’t read 100% of the Sequences, it’s also not like my understanding is 0%. I’d have read more if they weren’t so long. And while an intelligent person can come up with intelligent ways of thinking, I’m not sure the reverse holds. I’m also mostly interested in tail-end knowledge. For some posts, I can guess the content from the title, which is boring. Finally, teaching people what not to do is really inefficient, since the space of possible mistakes is really big.
Your last link needs an s before the dot.
Anyway, I respect your decision, and I understand the purpose of this site a lot better now (though there’s still a small, misleading difference between the explanation of rationality and how users actually behave. Even the name of the website gave me the wrong impression).