Science hasn’t increased our social skills or our understanding of ourselves; modern wisdom and life advice is no better than it was 2000 years ago.
Hard disagree: there’s an entire field of psychology, plus decision theory and ethics using reflective equilibrium in light of science.
Ancient wisdom can fail, but it’s just as easy for me to find examples in which common sense goes terribly wrong. It’s hard to fool-proof anything, be it technology or wisdom.
Well, some things go wrong more often than others. Wisdom goes wrong a lot of the time: it isn’t immune to memetic selection, and there isn’t much of a mechanism to prevent you from falling for false memes. Technology, past a certain point, goes wrong far less. A biology textbook is much more likely to be accurate and to give better medical advice than an Ayurvedic textbook.
The whole “Be like water” thing is just flexibility/adaptability
Yes, it’s a metaphor for adaptiveness, but I don’t understand where it may apply other than being a nice way to say “be more adaptive”. It’s like having a logical model, the way maths gives you one, except for adaptiveness: you import the idea of water-like adaptiveness into situations.
As for the wisdom that isn’t much connected to reality (the kind which doesn’t seem to apply to anything), it’s mostly just the axioms of human cognition/nature.
You know what might be an axiom of human cognition? Bayes’ rule and the other axioms of statistics. I have found that I can bypass a lot of wisdom by using these axioms where others are stuck without a proper model in real life because of ancient wisdom. E.g. I stopped taking Ayurvedic medication which contained carcinogens; and where people spend hours agonising over certain principles in ethics or decision theory, I know the laws that prevent such confusion.
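To make the Bayes’ rule point concrete, here is a minimal sketch of the kind of update I mean (all numbers are invented purely for illustration):

```python
# Toy Bayesian update: how much should one positive test for a rare
# condition move my belief? Every number here is made up for illustration.

prior = 0.01           # P(condition) before any evidence
sensitivity = 0.90     # P(positive test | condition)
false_positive = 0.05  # P(positive test | no condition)

# Bayes' rule: P(condition | positive) = P(positive | condition) * P(condition) / P(positive)
p_positive = sensitivity * prior + false_positive * (1 - prior)
posterior = sensitivity * prior / p_positive

print(f"{posterior:.3f}")  # ~0.154, still far from certainty
```

The posterior follows mechanically from the prior and the likelihoods; no proverb required.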
If you’re in a good mood then the external world will seem better too. A related quote is “As you think, so you shall become”, which is oddly similar to the idea of cognitive behavioural therapy.
Honestly I agree with this part; I think this is the biggest weakness of rationalism. The failure to overcome akrasia in a general-purpose way is a failure of rationality. I find it hard to imagine a person like David Goggins who is also a rationalist. The obsession with accuracy doesn’t play well with the romanticism of motivation and self-overcoming; it’s a battle you have to fight and figure out daily, and under the constraints of reality that becomes difficult.
There’s an entire field of psychology, yes, but most men are still confused by women saying “it’s fine” when they are clearly annoyed. Another thing is women dressing up because they want attention from specific men. Dressing up in a sexy manner is not a free ticket for any man to harass them, but socially inept men will say “they were asking for it” because the whole concept of selection and standards doesn’t occur to them in that context. And have you read Niccolò Machiavelli’s “The Prince”? It predates psychology, but it is psychology, and it’s no worse than modern books on office politics and such, as far as I can tell. Some things just aren’t improving over time.
wisdom goes wrong a lot of the time
You gave the example of the Ayurvedic textbook, but I’m not sure I’d call that “wisdom”. If we compare ancient medicine to modern medicine, then modern medicine wins in like 95% of cases. But for things relating to humanity itself, I think that ancient literature comes out ahead. Modern hard sciences like mathematics are too inhuman (autistic people are worse at socializing because they’re more logical and objective). And modern soft sciences are frankly pathetic quite often (Gardner’s Theory of Multiple Intelligences is nothing but a psychological defense against the idea that some people aren’t very bright. Whoever doesn’t realize this should not be in charge of helping other people with psychological issues).
I don’t understand where it may apply other than being a nice way to say “be more adaptive”
It’s a core concept which applies to all areas of life. Humans won against other species because we were better at adapting. Nietzsche wrote “The snake which cannot cast its skin has to die. As well the minds which are prevented from changing their opinions; they cease to be mind”. This community speaks a lot about “updating beliefs” and “intellectual humility” because thinking that one has all the answers, and not updating one’s beliefs over time, leads to cognitive inflexibility/stagnation, which prevents learning. Principles are incredibly powerful, and most human knowledge probably boils down to about 200 or 300 core principles.
I have found that I can bypass a lot of wisdom by using these axioms
Would I be right to guess that ancient wisdom fails you the most in objective areas of life, and that it hasn’t failed you much in the social parts? I don’t disagree that modern axioms can be useful, but I think there are many areas where “intelligent” approaches lead to worse outcomes. For the most part, attempting to control things leads to failure. I’ve had more unpleasant experiences on heavily moderated platforms than I have had in completely unmoderated spaces. I think it’s because self-organization can take place once disturbance from the outside ceases. But we will likely never know.
The failure to overcome akrasia in a general-purpose way is a failure of rationality
You could put it like that. I’d say something like “The rules of the brain are different than those of math, if you treat the brain like it’s supposed to be rational, you will always find it to be malfunctioning for reasons that you don’t understand”. Too many geniuses have failed at living good lives for me to believe that intelligence is enough. I have friends with IQs above 145 who are depressed because they think too rationally to understand their own nature. They reject the things which could help them, because they look down on them as subjective/silly/irrational. David Goggins’s story is pretty interesting. I can’t say I went through as much as him, but we do have things in common. This might be why I have the courage to criticize science on LW in the first place.
There’s an entire field of psychology, yes, but most men are still confused by women saying “it’s fine” when they are clearly annoyed. Another thing is women dressing up because they want attention from specific men. Dressing up in a sexy manner is not a free ticket for any man to harass them, but socially inept men will say “they were asking for it” because the whole concept of selection and standards doesn’t occur to them in that context. And have you read Niccolò Machiavelli’s “The Prince”? It predates psychology, but it is psychology, and it’s no worse than modern books on office politics and such, as far as I can tell. Some things just aren’t improving over time.
I think the majority of people aren’t aware of psychology and the various fields under it. Ethics and decision theory give a lot of clarity on such decisions when you analyse the payoff matrix. I haven’t read The Prince, but I have read excerpts from it in the self-improvement diaspora. I am not denying the value which such literature gives us; I just think we should move on by learning from it and building on top of it in light of newer methods. Besides, I am more of a moral anti-realist, so lol. I don’t think there are universally compelling arguments for these ethical things, but people with enough common psychological and cultural ground can cooperate.
Agreed. Some fields under psychology are pathetic. But the fields like cognitive biases etc are not.
and that it hasn’t failed you much in the social parts?
Well, astrology has clearly failed me. My mom often had these Luddite-adjacent ideas about what I am meant to do in life, because her entire source of ethics was astrology. Astrology as career advice is like rolling a die and assigning all the well-known professions to numbers, rather than looking at actual life satisfaction or value fulfillment.
“The rules of the brain are different than those of math, if you treat the brain like it’s supposed to be rational, you will always find it to be malfunctioning for reasons that you don’t understand”
I would strongly disagree on the front of intelligence. Becoming more rational, in the sense of cognitive algorithms which tend to lead to systematic optimality (in this case truth-seeking / achieving goals), is indeed possible and is pretty much a part of growth.
I would weakly disagree on the front of Internal Family Systems (with the internal double crux special case being extremely useful) and other introspective reductionist methods, where you break down your emotional responses and processes into parts, understand what you like/dislike, and make various attempts to bridge the two. On this front there is a plethora of competing theories, owing to the easy problem of consciousness and the attempt to understand experience functionally.
And as for the brain not working as I want it to: when I model other parts of this brain, I find it emotionally engaged in things which aren’t optimal for some of my goals, and it isn’t contradictory with rationality to acknowledge or deal with these feelings.
I was praising Goggins because he’s more of the type who is willing to fight himself, and in more than half of the introspective models that, without acknowledgement, borders on self-harm. I find his strategy to be intuitively much better lol.
Where I would agree is that if you don’t understand something then your theory is probably wrong. There are no confusing facts, only models which are confused by facts.
Too many geniuses have failed at living good lives for me to believe that intelligence is enough.
I think growth is important; I like to think of it as intelligence being compute power, and growth and learning being more about changing algorithms. Besides, there is a good amount of correlations with IQ you might want to look into. I think this area is very contentious (I got a system-1 response to check the social norms due to past bans lol), but we’re on LessWrong, so you can continue.
This might be why I have the courage to criticize science on LW in the first place.
You’re welcome. Maybe you should read the Sequence Highlights to get introduced to LW’s POV and to understand other people’s positions here.
I think the majority of people aren’t aware of psychology and the various fields under it
I don’t think there’s a reason for most people to learn psychology or game theory, as you can teach basic human behaviour and such without the academic perspective. I even think it’s a danger to be more “book smart” than “street smart” about social things. So rather than teaching game theory in college, schools could make children read and write a book report on “How to Win Friends & Influence People” in 4th grade or whatever. Academic knowledge which doesn’t make it to 99% of the population doesn’t help ordinary people much. But a lot of this knowledge is simple and easier than the math homework children tend to struggle with.
I don’t particularly believe in morality myself, and I also came to the conclusion that having shared beliefs and values is really useful, even if it means that a large group of people are stuck in a local maximum. As a result of this, I’m against people forcing their “moral” beliefs on foreign groups, especially when these groups are content and functional already. So I reject any global consensus of what’s “good”. No language is more correct than another language, and the same applies for cultures and such.
Well, it depends on your definition of inhuman; my_inhuman =/= your_inhuman, since value is a two-place function. My peers in high school found at least one of the hard sciences fun. Like them, I find the hard sciences pretty cool to learn about for fulfilling my other goals.
It’s funny that you should link that post, since it introduces an idea that I already came up with myself. What I meant was that people tend to value what’s objective over what’s subjective, so that their rational thinking becomes self-destructive or self-denying in a sense. Rationality helps us to overcome our biases, but thinking of rationality as perfect and of ourselves as defective is not exactly healthy. A lot of people who think they’re “super-humans” are closer to being “half-humans”, since what they’re doing is closer to destroying their humanity than overcoming or going beyond it. And I’m saying this despite the fact that some of these people are better at climbing social hierarchies or getting rich than me. In short, the objective should serve the subjective, not the other way around. “The lens which sees its own flaws” merely conditions itself to seeing flaws in everything. Some of my friends are artists, and they hate their own work because they’re good at spotting imperfections in it; I don’t consider this level of optimization to be any good for me. When I’m rational, it’s because it’s useful for me, so I’m not going to harm myself in order to become more rational. That’s like wanting money thinking it will make me happy, and then sacrificing my happiness in order to make money.
But the fields like cognitive biases etc are not
I’ll agree as long as these fields haven’t been subverted by ideologies or psychological copes against reality yet (as that’s what tends to make soft sciences pathetic). The “tall poppy syndrome” has warped the public’s perception of the “Dunning-Kruger effect”, so that it becomes an insult you can use against anyone you disagree with who is certain of themselves, especially in a social situation in which a majority disagree.
Astrology
Astrology is wrong and unscientific, but I can see why it would originate. It’s a kind of pattern recognition gone awry. Since everything is related, and the brain is sometimes lazy and thinks that correlation=causation and that X implies Y is the same as Y implies X, people use patterns to predict things, and assume that recreating the patterns will recreate the things. This is mostly wrong, of course, but not always. People who are happy are likely to smile, but smiling actually tends to make you happier as well. Do you know the tragic story behind the person who invented handwashing? He found the right pattern, and the results were verifiable, but because his idea sounded silly, he ended up suffering.
If you had used astrology yourself, it might have ended better, as you’d be likely to interpret it as whatever you wanted to be true, and your belief that your goal in life was fated to come true would help against the periodic doubt that people face in life.
I would strongly disagree on the front of intelligence
Intelligence is not something you are, it’s something you have. Identifying with your intelligence is how you disown 90% of yourself. Seeing intelligence as something available to you rather than as something you are helps eliminate internal conflict. Every “gifted kid burnout” and “depressed intelligent person” situation I have seen was partly caused by this dangerous identification. Even if you dismiss everything else I’ve said so far, I want to stress the importance of this one thing. Lastly, “systematic optimality” seems to suffer from something like Goodhart’s law. When you optimize for one variable, you may harm 100 other variables slightly without realizing it (paperclip optimizers seem like the mathematical limit of this idea). Holistic perspectives tend to go wrong less often.
I like the Internal Family Systems view. I think the brain has competing impulses whose strength depends on your physical and psychological needs. But while I think your brain is rational according to what it wants, I don’t think it’s rational according to what you want. In fact, I think people’s brains tend to toy with them completely. It creates suffering to motivate you, it creates anxiety to get you to defend yourself, it creates displeasure and tells you that you will be happy if you achieve your goals. Being happy all the time is easy, but our brain makes this hard to realize so that we don’t hack our own reward systems and die. If you only care about a few goals, your worldview is extremely simple. You have a complex life with millions of factors, but you only care about a few objective metrics? I’m personally glad that people who chase money or fame above all end up feeling empty, for you might as well just replace humanity with robots if you care so little for experiencing what life has to offer.
there is a good amount of correlations with IQ
Oh, I know, I have a few bans from various websites myself (and I once got rate limited on here). And intelligence correlates with nihilism, meta-thinking, systemization, and anxiety (I know a study found the correlation to mental illness to be false, but I think the correlation is negative until about 120 IQ and then positive after). But why did Nikola Tesla’s intelligence not prevent him from dying poor and lonely? Why was Einstein so awkward? Why do so many intelligent people not enjoy life very much? My answer is that these are consequences of lacking humanity / healthy ways of thinking. It’s not just that stupid people are delusional. I personally like the idea that intelligence comes at the cost of instinct. For reference, I used to think rationally, I hated the world, I hated people, I couldn’t make friends, I couldn’t understand myself. Now I’m completely fine, I even overcame depression. I don’t suffer and I don’t even dislike suffering, I love life, I like socializing. I don’t worry about injustice, immorality or death.
I just found a highlight of the sequences, and it turns out that I have read most of the posts already, or just discovered the principles myself previously. And I disagree with a few of the moral rules because they decrease my performance in life by making me help society. Finally, my value system is what I like, not what is mathematically optimal for some metric which people think could help society experience less negative emotions (I don’t even think this is true or desirable).
I even think it’s a danger to be more “book smart” than “street smart” about social things.
Honestly, I don’t know enough about people to tell if that’s really the case. For me, book smarts become street smarts when I make them truly a part of me.
That’s how I live anyway. For me, when you formalise street smarts they become book smarts to other people, and the latter are likely to yield better predictions, aside from the places where you lack compute, like in the case of society, where most people don’t use their brains outside of social/consensus reality. So maybe you’re actually onto something here, along the lines of “don’t tell them the truth because they cannot handle it” lol.
Astrology is wrong and unscientific, but I can see why it would originate. It’s a kind of pattern recognition gone awry.
Well, since I wanted to dismantle the Chesterton fence, I did reach similar conclusions to yours regarding why it came to be and why they (the ancients) fell for it; the correlation/causation one is the general-purpose explanation. One major reason was agriculture, where it was likely to work well due to the common cause of seasons and relative star movement. So it can also be thought of as faulty generalisation.
If you had used astrology yourself, it might have ended better, as you’d be likely to interpret it as whatever you wanted to be true, and your belief that your goal in life was fated to come true would help against the periodic doubt that people face in life.
That’s false. I wouldn’t have socially demotivated my mom, through my apathy, away from wasting too much money on astrology; if I had been enthusiastic about it, that would have fueled her desire. Astrology is like the false hope of a lottery, a waste of emotional energy.
I would have been likely to fall for other delusions surrounding astrology instead of spending that time learning things, for example going on a pilgrimage for a few weeks before exams, etc.
Besides, astrology predicts everything on the list of usual human behavior, and so more or less ends up predicting nothing.
Lastly, “systematic optimality” seems to suffer from something like Goodhart’s law. When you optimize for one variable, you may harm 100 other variables slightly without realizing it (paperclip optimizers seem like the mathematical limit of this idea). Holistic perspectives tend to go wrong less often.
Well, more or less rational is defined w.r.t. cognitive algorithms; you tend to have one variable, achieving goals. And cognitive algorithms which are better at reaching certain goals are more rational w.r.t. that goal.
There is a distinction made between truth-oriented epistemic rationality and day-to-day, goal-oriented instrumental rationality, but for me they’re pretty similar, in that for the epistemic kind the goal is truth.
I think the distinction was made because there’s a significant amount of epistemics in rationality.
If your goal is optimising 100 variables, then go with it. For a rationalist, truth tends to be their highest instrumental value; that’s the main difference imo between a rationalist and, say, a post-rationalist or a pre-rationalist. They can have other terminal values above that, like life, liberty and the pursuit of happiness, etc.
I’m personally glad that people who chase money or fame above all end up feeling empty, for you might as well just replace humanity with robots if you care so little for experiencing what life has to offer.
I think it again depends on value being a two-place function. Some people may find fulfillment from that; I have met some who are like that. I think quite a bit of the literature on the topic is a bit biased in favour of common morality.
But why did Nikola Tesla’s intelligence not prevent him from dying poor and lonely? Why was Einstein so awkward? Why do so many intelligent people not enjoy life very much?
I think you would need to provide evidence for such claims. My prior is set against them, given the low amount of evidence I have encountered, and I cannot update it just because some cultural wisdom says so, because cultural wisdom is often wrong.
For reference, I used to think rationally, I hated the world, I hated people, I couldn’t make friends, I couldn’t understand myself.
Then you weren’t thinking rationally. To quote:
If you say, “It’s (epistemically) rational for me to believe X, but the truth is Y,” then you are probably using the word “rational” to mean something other than what I have in mind. (E.g., “rationality” should be consistent under reflection—“rationally” looking at the evidence, and “rationally” considering how your mind processes the evidence, shouldn’t lead to two different conclusions.)
Similarly, if you find yourself saying, “The (instrumentally) rational thing for me to do is X, but the right thing for me to do is Y,” then you are almost certainly using some other meaning for the word “rational” or the word “right.” I use the term “rationality” normatively, to pick out desirable patterns of thought.
You may even learn something about rationality from the experience, if you are already far enough grown in your Art to say, “I must have had the wrong conception of rationality,” and not, “Look at how rationality gave me the wrong answer!”
And I disagree with a few of the moral rules because they decrease my performance in life by making me help society. Finally, my value system is what I like, not what is mathematically optimal for some metric which people think could help society experience less negative emotions (I don’t even think this is true or desirable).
Well, then you can be mathematically optimal for the other metric. The laws of decision theory don’t stop working if you change your utility function. Unless you want to get money-pumped lol; in that case your preferences are circular. Yes, you might argue that we’re not knowledgeable enough to figure out what our values will be in various subject areas; there’s a reason we have an entire field of AI alignment for various such issues, and there are various problems with inferring our desires and the limits of introspection.
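A toy illustration of the money-pump point (a hypothetical agent and a made-up trading fee, just to show why circular preferences are exploitable):

```python
# An agent with circular preferences A > B > C > A will pay a small fee for
# every "upgrade" and can be walked in circles forever, losing money each lap.
# Everything here is invented for illustration.

prefers = {("A", "B"): "A", ("B", "C"): "B", ("C", "A"): "C"}  # circular preferences

def accepts(holding, offered):
    """The agent trades whenever it strictly prefers the offered item."""
    pair = (offered, holding) if (offered, holding) in prefers else (holding, offered)
    return prefers[pair] == offered

money, holding = 100, "A"
for offered in ["C", "B", "A"] * 2:  # two full laps around the cycle
    if accepts(holding, offered):
        holding = offered
        money -= 1  # small trading fee per swap

print(holding, money)  # still holding "A", but 6 units poorer
```

With transitive preferences the trading stops after at most a couple of swaps; with circular ones it never does, which is the sense in which the laws of decision theory keep applying whatever your utility function is.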
Another issue with teaching it academically is that academic thought, like I already said, frames things in a mathematical and thus non-human way. And treating people like objects to be manipulated for certain goals (a common consequence of this way of thinking) is not only bad taste, it makes the game of life less enjoyable. Learning how to program has harmed my immersion in games, and I have a tendency to powergame, which makes me learn new videogames way faster than other people, also with the result that I’m having less fun than them. I think rationality can result in the same thing. Why do people dislike “sellouts” and “car salesmen” if not for the fact that they simply optimize for gains in a way which conflicts with taste? But if we all just treat taste like it’s important, or refuse to collect so much information that we can see the optimal routes, then Moloch won’t be able to hurt us.
If you want something to be part of you, then you simply need to come up with it yourself; then it will be your own knowledge. Learning other people’s knowledge, however, feels to me like consuming something foreign.
Of course, my defense of ancient wisdom so far has simply been to translate it into an academic language in which it makes sense. “Be like water” is street-smarts, and “adaptability is a core component of growth/improvement/fitness” is the book-smarts. But the “street-smarts” version is easier to teach, and now that I think about it, that’s what the bible was for.
Most things that society wastes its time discussing are wrong. And they’re wrong in the sense that even an 8-year-old should be able to see that all controversies going on right now are frankly nonsense. But even academics cannot seem to frame things in a way that isn’t riddled with contradictions and hypocrisy. Does “We are good, but some people are evil, and we need to fight evil with evil otherwise the evil people will win by being evil while we’re being good” not sound silly? A single thought will get you Karl Popper’s “paradox of tolerance”, and a single thought more will make you realize that it’s not a paradox but a kind of neutrality/reflexivity which makes both sides equal, and that “We need to fight evil” means “We want our brand of evil to win” as long as people don’t dislike evil itself but rather how it’s used. Again, this is not more complicated than “I punched my little brother because I was afraid he’d punch me first, and punching is bad”, which I expect most children to see the problem with.
astrology
The thought experiment I had in mind was limited to a single isolated situation; you took it much further, haha. My point was simply “If you use astrology for yourself, the outcomes are usually alright”. Same with tarot cards: as far as I’m concerned, they’re a way to talk with your subconscious without your ego getting in the way, which requires acting as if something else is present. Even crystal balls are probably a kind of Rorschach test, and should not be used to “read other people” for this reason. Finally, I don’t disagree with the low utility of astrology, but false hope gives people the same reassurance as real hope. People don’t suffer from the non-existence of god, but from the doubt of his existence. The actual truth value of beliefs has no psychological effects (proof: otherwise we could use beliefs to measure the state of reality).
are more rational w.r.t. that goal
I disagree, as I know of counter-examples. It’s more likely for somebody to become rich making music if their goal is simply to make music and enjoy themselves, than if their goal is to become rich making music. You see similar effects for people who try to get girlfriends, or happiness for that matter. If X results in Y, then you should optimize for X and not for Y. Many companies are dying because they don’t realize such a simple thing (they try to exploit something pre-existing rather than making more of what they’re exploiting, for instance the trust in previous IPs). Ancient wisdom tackles this. Wu Wei is about doing the right thing by not trying to do it. I don’t know how often this works, but it sometimes does.
I have to disagree that anyone’s goal is truth. I’ve seen strong evidence that knowledge of an environment is optimal for survival, and that knowledge-optimizing beats self-delusion every time, but even in this case, the real goal is “survival” and not “truth”. And my proof is the following: If you optimize for truth because it feels correct or because you believe it’s what’s best, then your core motivation is feelings or beliefs respectively. For similar reasons, non-egoism is trivially impossible. But the “Something to protect” link you sent seems to argue for this as well? And truth is not always optimal for goals. The belief that you’re justified and the belief that you can do something are both helpful. The average person is 5/10 but tends to rate themselves as 7/10, which may be around the optimal bias. By the way, most of my disagreements so far seem to be “Well, that makes sense logically, but if you throw human nature into the equation then it’s wrong”.
Some people may find fulfillment from that
I find myself a little doubtful here. People usually chase fame not because they value it, but because other people seem to value it. They might even agree cognitively on what’s valuable, but it’s no use if they don’t feel it.
I think you would need to provide evidence for such claims
How many great people’s autobiographies and life stories have you read? The nearer you get to them, the more human they seem, and if you get too close you may even find yourself crushed by pity. About Isaac Newton, it was even said “As a man he was a failure; as a monster he was superb”. Boltzmann committed suicide, John Nash suffered from schizophrenia. Philosophy is even worse off; titles like “suicide or coffee?” do not come from healthy states of mind. And have you read the Vasistha Yoga? It’s basically poison. But it’s ultimately a projection: a worldview does not reveal the world, but rather the person with the worldview.
Then you weren’t thinking rationally
But what saved me was not changing my knowledge, but my interpretation of it. I was right that people lie a lot, but I thought it was for their own sake, when it’s mostly out of consideration for others. I was right that people were irrational, but I didn’t realize that this could be a good thing.
No one can exempt you from laws of rationality
That seems like it’s saying “I define rationality as what’s correct, so rationality can never be wrong, because that would mean you weren’t being rational”. By treating rationality as something which is discovered rather than created (by creating a map and calling it the territory), any flaw can be justified as “that wasn’t real rationality, we just didn’t act completely rationally because we’re flawed human beings! (our map was simply wrong!)”. There can be no universal knowledge; maps of the territory are inherently limited (and I can prove this). Insofar as rationality uses math and verbal or written communication, it can only approximate something which cannot be put into words. “The dao which can be spoken of is not the dao” simply means “the map is not the territory”.
By the way, I think I’ve found a big difference between our views. You’re (as far as I can tell) optimizing for “Optimization power over reality / a more reliable map”, while I’m optimizing for “Biological health, psychological well-being and enjoyment of existence”. And they do not seem to have as much in common as rationalists believe.
But if rationality in the end worships reality and nature, that’s quite interesting, because that puts it in the same boat as Taoism and myself. Some people even put Nature=God.
Finally, if my goal is being a good programmer, then a million factors will matter, including my mood, how much I sleep, how much I enjoy programming, and so on. But somebody who naively optimizes for programming skills might practice at the cost of mood, sleep, and enjoyment, and thus ultimately end up with a mediocre result. So in this case, a heuristic like “Take care of your health and try to enjoy your life” might not lose out to a rat-race-like mentality in performance. Meta-level knowledge might help here, but I still don’t think it’s enough. And the tendency to dismiss things which seem unlikely, illogical or silly is not as great a heuristic as one would think, perhaps because any beliefs which manage to stay alive despite being silly have something special about them.
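Here’s a toy model of what I mean, with completely made-up numbers and functional forms, comparing a naive skill-only grinder against the “take care of your health” heuristic:

```python
# Toy model only: every number and curve below is invented to illustrate the
# point, not an empirical claim about practice, sleep, or skill.

def daily_skill_gain(practice_hours):
    sleep_hours = 24 - 8 - practice_hours            # 8 h/day of fixed obligations
    rest_factor = min(sleep_hours / 8, 1.0) ** 2     # tired practice teaches much less
    return practice_hours * rest_factor

grind    = 365 * daily_skill_gain(practice_hours=12)  # naive "more is always better"
balanced = 365 * daily_skill_gain(practice_hours=8)   # leaves room for sleep and mood

print(grind, balanced)  # the balanced schedule ends the year ahead
```

The real system has far more variables than this, which is exactly why the single-metric optimizer goes wrong.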
Another issue with teaching it academically is that academic thought, like I already said, frames things in a mathematical and thus non-human way. And treating people like objects to be manipulated for certain goals (a common consequence of this way of thinking) is not only bad taste, it makes the game of life less enjoyable.
If you want something to be part of you, then you simply need to come up with it yourself; then it will be your own knowledge. Learning other people’s knowledge, however, feels to me like consuming something foreign.
Yes, the trick for that is to delete the piece of knowledge you learnt and ask the question: how could I have come up with this myself?
Of course, my defense of ancient wisdom so far has simply been to translate it into an academic language in which it makes sense. “Be like water” is street-smarts, and “adaptability is a core component of growth/improvement/fitness” is the book-smarts. But the “street-smarts” version is easier to teach, and now that I think about it, that’s what the bible was for.
That just sounds to me like “we need wisdom because people cannot think”. Yes, I would agree, considering that when you open Reddit, Twitter or any other platform you can find many biases being upvoted. I would agree a memetic immune system is required for a person unaware of the various background literature required to bootstrap rationality. I am not advocating for teaching anything; I don’t have plans to be an activist or the will to change society. But consider this: if you know enough rationality you can easily get past all that.
I would agree on the latter part regarding good/evil. Unlike other rationalists, this is why I don’t have the will to change society. The internet has killed my societal moral compass for good/evil, or made me more egoistic, however you may like to put it. “Good” just carries a positive system-1 connotation for me; I am just emoting it, but I mostly focus on my life. Or, to be brutally honest about it, I don’t care about society as long as my interests are being fulfilled.
The actual truth value of beliefs has no psychological effects (proof: otherwise we could use beliefs to measure the state of reality).
Agreed, the map is not the territory; it feels the same to be wrong as it does to be right.
It’s more likely for somebody to become rich making music if their goal is simply to make music and enjoy themselves, than if their goal is to become rich making music.
Yes, if someone isn’t passionate about such endeavours, they may not have the will to sustain them. But if a person is totally apathetic to monetary concerns, they’re not going to make it either. So a person may argue that on a meta level it’s more optimal to be passionate about a field, or to choose a field you’re passionate about in which you want to do better, in order to overcome akrasia; and there might be some selection bias at play, where a person who’s good at something is likely to have a positive feedback loop with the subject.
But the “Something to protect” link you sent seems to argue for this as well?
Yes, exactly: truth is in service to other goals; that’s what I meant by “highest instrumental value”, if my phrasing wasn’t clear. But you don’t deliberately believe false things; that’s what rationality is all about. Truth is nice to have, but usefulness is everything.
Believing false things purposefully is impossible either way; you’re not anticipating them with high probability. That’s not how rationalist belief works. When you believe something, that’s how reality is to you; you look at the world through your beliefs.
How many great people’s autobiographies and life stories have you read?
Not many, but it would be unrepresentative to generalise from that.
But it’s ultimately a projection: a worldview does not reveal the world, but rather the person with the worldview.
Ethically yes, epistemically no. Reality doesn’t care; this is what society gets wrong. If I am disagreeing with your climate denial or climate catastrophism, I am not proposing what needs to be done; there is a divide between morals and epistemics.
“I define rationality as what’s correct, so rationality can never be wrong, because that would mean you weren’t being rational”
Yes, finally you get my point. We label those things rationality, the things which work; the virtue of empiricism. Rationality is about having cognitive algorithms which have higher returns systematically on whatever it is that you want.
maps of the territory are inherently limited (and I can prove this)
When you experience something your brain forms various models of it, and you look at the world through your beliefs.
You’re optimizing for “Optimization power over reality / a more reliable map”, while I’m optimizing for “Biological health, psychological well-being and enjoyment of existence”. And they do not seem to have as much in common as rationalists believe
That’s a misrepresentation of my position: I said truth is my highest instrumental value, not my highest terminal value. Besides, a good portion of hardcore rationalists tend to have something to protect, a humanistic cause which they devote themselves to; that tends to be aligned with their terminal values, however they see fit. Others may focus solely on their own interests, like health, life and wellbeing.
To reiterate: you only seek truth as much as it allows you to get what you want, but you don’t believe in falsities. That’s it.
But if rationality in the end worships reality and nature, that’s quite interesting, because that puts it in the same boat as Taoism and myself. Some people even put Nature=God.
Rationality doesn’t necessarily have nature as a terminal value; rationality is a tool, the set of cognitive algorithms which work for whatever you want, with truth being the highest instrumental value, as you might have read in the Something to Protect article.
Finally, if my goal is being a good programmer, then a million factors will matter, including my mood, how much I sleep, how much I enjoy programming, and so on. But somebody who naively optimizes for programming skills might practice at the cost of mood, sleep, and enjoyment, and thus ultimately end up with a mediocre result. So in this case, a heuristic like “Take care of your health and try to enjoy your life” might not lose out to a rat-race-like mentality in performance. Meta-level knowledge might help here, but I still don’t think it’s enough. And the tendency to dismiss things which seem unlikely, illogical or silly is not as great a heuristic as one would think, perhaps because any beliefs which manage to stay alive despite being silly have something special about them.
None of that is incompatible with rationality; rather, rationality will help you get there. Heuristics like “take care of your health and try to enjoy life” seem more like vague plans to fulfill your complex set of values, which one may discover more about. Values are complex, and there are various posts you can find here which may help you model yourself better and reach reflective equilibrium, which is the best you can do either way, both epistemically and morally (the former, epistemics, is much more easily reached by focusing on getting better w.r.t. your values than by focusing on it solely, as highlighted by the post, since truth is only instrumental).
But these ways of looking at the world are not factually wrong; they’re just perverted in a sense. I agree that schools are quite terrible in general.
how could I have come up with this myself?
That helps for learning facts, but one can teach the same things in many different ways. A math book from 80 years ago may be confusing now, even if the knowledge it covers is something that you know already, because the terms, notation and ideas are slightly different.
we need wisdom because people cannot think
In a way. But some people who have never learned psychology have great social skills, and some people who are excellent with psychology are poor socializers. Some people also dislike “nerdy” subjects, and it’s much more likely that they’d listen to a TED talk on body language than read a book on evolutionary psychology and non-verbal communication. Having an “easy version” of knowledge available which requires 20 IQ points less than the hard version seems like a good idea. Some of the wisest and most psychologically healthy people I have met have been non-intellectual and non-ideological, and even teenagers or young adults. Remember your “Things to unlearn from school” post? Some people may have less knowledge than the average person, and thus fewer errors, making them clear-sighted in a way that makes them seem well-read. Teaching these people philosophy could very well ruin their beautiful worldviews rather than improve on them.
if you know enough rationality you can easily get past all that.
I don’t think “rationality” is required. Somebody who has never heard about the concept of rationality, but who is highly intelligent and thinks things through for himself, will be alright (outside of existential issues and infohazards, which have killed or ruined a fair share of actual geniuses). But we’re both describing conditions which apply to less than 2% of the population, so at best we have to suffer from the errors of the 98%.
I’m not sure what you mean by “when you dissent when you have an overwhelming reason”. The article you linked to worded it “only when”, as if one should dissent more often, but it also warns against dissenting, since it’s dangerous. By the way, I don’t like most rationalist communities very much, and one of the reasons is that they have a lot of snobs who will treat you badly if you disagree with them. The social mockery I’ve experienced is also quite strong, which is strange, since you’d expect intelligence to correlate with openness, and the high rate of autistic people to combat some of the conformity.
I also don’t like activism, and the only reason I care about the stupid ideas of the world is that all the errors are making life harder for me and the people that I care about. Like I said, not being an egoist is impossible, and there’s no strong evidence that all egoism is bad, only that egoism can be bad. The same goes for money and power; I think they’re neutral and both potentially good/bad. But being egoistic can make other people afraid of me if I don’t act like I don’t realize what I’m doing.
It’s more optimal to be passionate about a field
I think this is mostly correct. But optimization can kill passion (since you’re just following the meta and not your own desires). And common wisdom says “Follow your dreams” which is sort of naive and sort of valid at the same time.
Believing false things purposefully is impossible
I think believing something you think is false, intentionally, may be impossible. But false beliefs exist, so believing in false things is possible. For something where you’re between 10% and 90% sure, you can choose if you want to believe in it or not, by using the following algorithm: say “X is true because” and then allow your brain to search through your memory for evidence. It will find some.
The articles you posted on beliefs are about the rules of linguistics (belief in belief is a valid string) and logic, but how belief works psychologically may be different. I agree that real beliefs are internalized (exist in system 1) to the point that they’re just part of how you anticipate reality. But some beliefs are situational and easy to consciously manipulate (example: self-esteem. You can improve or harm your own self-esteem in about 5 minutes if you try, since you just pick a perspective and set of standards in which you appear to be doing well or badly). Self-esteem is subjective, but I don’t think the brain differentiates subjective and objective things; it doesn’t even know the difference.
And it doesn’t seem like you value truth itself, but that you value the utility of some truths, and only because they help you towards something you value more?
Ethically yes, epistemically no
You may believe this because a worldview has to be formed through interactions with the territory, which means that a worldview cannot be totally unrelated to reality? You may also mean this: that if somebody has both knowledge and value judgements about life, then the knowledge is either true or false, while the value judgements are a function of the person. A happy person might say “Life is good” and a depressed person might say “Life is cruel”, and they might even know the same facts.
Online “black pills” are dangerous, because the truth value of the knowledge doesn’t imply that the negative worldview of the person sharing it is justified. Somebody reading the Vasistha Yoga might become depressed because he cannot refute it, but this is quite an advanced error in thinking, as you don’t need to refute it for its negative tone to be false.
Rationality is about having cognitive algorithms which have higher returns
But then it’s not about maximizing truth, virtue, or logic. If reality operates by different axioms than logic, then one should not be logical. The word “virtue” is overloaded, so people write as if the word is related to morality, but it’s really just about thinking in ways which make one more clear-sighted. So people who tell me to have “humility” are “correct” in that being open to changing my beliefs makes it easier for me to learn, which is rational, but they often act as if they’re better people than me (as if I’ve made an ethical/moral mistake in being stubborn or certain of myself). By truth, one means “reality” and not the concept “truth” as the result of a logic expression. This concept is overloaded too, so that it’s easy for people to manipulate a map with logical rules and then tell another person “You’re clearly not seeing the territory right”.
physics is more accurate than intuitive world models
Physics is our own constructed reality, which seems to act a lot like the actual reality. But I think an infinite number of physics could exist which predict reality with high accuracy. In other words, “There’s no one true map”. We reverse-engineer experiences into models, but experience can create multiple models, and multiple models can predict experiences. One of the limitations is “there’s no universal truth”, but this is not even a problem, as the universe is finite. But “universal” in mathematics is assumed to be truly universal, covering all things, and it’s precisely this which is not possible. But we don’t notice, and thus come up with the illusion of uniqueness. And it’s this illusion which creates conflict between people, because they disagree with each other about what the truth is, claiming that conflicting things cannot both be true. I dislike the consensus because it’s the consensus and not a consensus.
A good portion of hardcore rationalists tend to have something to protect, a humanistic cause
My bad for misrepresenting your position. Though I don’t agree that many hardcore rationalists care for humanistic causes. I see them as placing rationality above humanity, and thus preferring robots, cyborgs, and AIs above humanity. They think they prefer an “improvement” of humanity, but this functionally means the destruction of humanity. If you remove negative emotions (or all emotions entirely; after all, these are the source of mistakes, right?), subjectivity, and flaws from humans, and align them with each other by giving them the same personality, or get rid of the ego (it’s also a source of errors and unhappiness), what you’re left with is not human. It’s at best a sentient robot. And this robot can achieve goals, but it cannot enjoy them. I just remembered seeing the quote “Rationality is winning”, and I’ll admit this idea sounds appealing. But a book I really like (EST: Playing the game the new way, by Carl Frederick) is precisely about winning, and its main point is this: you need to give up on being correct. The human brain wants to have its beliefs validated, that’s all. So you let other people be correct, and then you ask them for what you want, even if it’s completely unreasonable.
Rationality doesn’t necessarily have nature as a terminal value
I meant nature as its source (of evidence/truth/wisdom/knowledge), “Nature” meaning reality/the dao/the laws of physics/the universe/GNON. I think most schools of thought draw their conclusions from reality itself. The only kind of worldview which seems disconnected from reality is the religions which create ideals out of what’s lacking in life and make those out to be virtue and the will of god.
None of that is incompatible with rationality
What I dislike might not be rationality, but how people apply it, and psychological tendencies in people who apply it. But upvotes and downvotes seem very biased in favor of a consensus and verifiability, rather than simply being about getting what you want out of life. People also don’t seem to like being told accurate heuristics which seem immoral or irrational (in the colloquial sense that regular people use), even if they predict reality well. There’s also an implicit bias towards altruism which cannot be derived from objective truth.
About my values: they already exist even if I’m not aware of them; they’re just unconscious until I make them conscious. But if system 1 functions well, then you don’t really need to train system 2 to function well, and it’s a pain to force system 2 rationality onto system 1 (your brain resists most attempts at self-modification). I like the topic of self-modification, but that line of study doesn’t come up on LW very often, which is strange to me. I still believe that the LW community downplays the importance of human nature and psychology. It may even undervalue system 1 knowledge (street smarts and personal experiences) and overvalue system 2 knowledge (authority, book smarts, and reasoning).
I got into this conversation because I thought I would find something new here. As an egoist I am voluntarily leaving this conversation in disagreement because I have other things to do in life. Thank you for your time.
The short version is that I’m not sold on rationality, and while I haven’t read 100% of the sequences it’s also not like my understanding is 0%. I’d have read more if they weren’t so long. And while an intelligent person can come up with intelligent ways of thinking, I’m not sure this is reversible. I’m also mostly interested in tail-end knowledge. For some posts, I can guess the content by the title, which is boring. Finally, teaching people what not to do is really inefficient, since the space of possible mistakes is really big.
Your last link needs an s before the dot.
Anyway, I respect your decision, and I understand the purpose of this site a lot better now (though there’s still a small, misleading difference between the explanation of rationality and how users actually behave. Even the name of the website gave the wrong impression).
Hard disagree, there’s an entire field of psychology, decision theory and ethics using reflective equilibrium in light of science.
Well some things go wrong more often than other things, wisdom goes wrong a lot of time, it isn’t immune to memetic selection, there is not much mechanism to prevent you from falling for false memes. Technology after one point goes wrong wayyy less. A biology textbook is much more likely to be accurate and better on medical advice than a ayurvedic textbook.
Yes it’s a metaphor for adaptiveness, but I don’t understand where it may apply other than being a nice way to say “be more adaptive”. It’s like logical model like maths but for adaptiveness you import the idea of water-like adaptiveness into situations.
You know what might be an axiom of human cognition? Bayes rule and other axioms in statistics. I have found that I can bypass a lot of wisdom by using these axioms where others are stuck without a proper model in real life due to ancient wisdom. Eg; I stopped taking ayurvedic medication which contained carcinogens; when people spend hours thinking about certain principles in ethics or decision theory I know the laws to prevent such confusion etc
Honestly I agree with this part, I think this is the biggest weakness of rationalism. I think the failure to general purpose overcome akrasia is a failure of rationality. I find it hard to believe that there would be a person like david goggins but a rationalist. The obsession with accuracy doesn’t play well with romanticism of motivation and self-overcoming, it’s a battle you have to fight and figure out daily, and under the constraints of reality it becomes difficult.
There’s an entire field of psychology, yes, but most men are still confused by women saying “it’s fine” when they are clearly annoyed. Another thing is women dressing up because they want attention from specific men. Dressing up in a sexy manner is not a free ticket for any man to harass them, but socially inept men will say “they were asking for it” because the whole concept of selection and stardards doesn’t occur to them in that context. And have you read Niccolò Machiavelli’s “The Prince”? It predates psychology, but it is psychology, and it’s no worse than modern books on office politics and such, as far as I can tell. Some things just aren’t improving over time.
You gave the example of the ayurvedic textbook, but I’m not sure I’d call that “wisdom”. If we compare ancient medicine to modern medicine, then modern medicine wins in like 95% of cases. But for things relating to humanity itself, I think that ancient literature comes out ahead. Modern hard sciences like mathematics are too inhuman (autistic people are worse at socializing because they’re more logical and objective). And modern soft sciences are frankly pathetic quite often (Gardner’s Theory of Multiple Intelligences is nothing but a psychological defense against the idea that some people aren’t very bright. Whoever doesn’t realize this should not be in charge of helping other people with psychological issues)
It’s a core concept which applies to all areas of life. Humans won against other species because we were better at adapting. Nietzsche wrote “The snake which cannot cast its skin has to die. As well the minds which are prevented from changing their opinions; they cease to be mind”. This community speaks a lot of “updating beliefs” and “intellectual humility” because thinking that one has all the answers, and not updating ones beliefs over time, leads to cognitive inflexibility/stagnation, which prevents learning. Principles are incredibly powerful, and most human knowledge probably boils down to about 200 or 300 core principles.
Would I be right to guess that ancient wisdom fails you the most in objective areas of life, and that it hasn’t failed you much in the social parts? I don’t disagree that modern axioms can be useful, but I think there’s many areas where “intelligent” approaches leads to worse outcomes. For the most part, attempting to control things lead to failure. I’ve had more unpleasant experiences on heavily moderated platforms than I have had in completely unmoderated spaces. I think it’s because self-organization can take place once disturbance from the outside ceases. But we will likely never know.
You could put it like that. I’d say something like “The rules of the brain are different than those of math, if you treat the brain like it’s supposed to be rational, you will always find it to be malfunctioning for reasons that you don’t understand”. Too many geniuses have failed at living good lives for me to believe that intelligence is enough. I have friends with IQs above 145 who are depressed because they think too rationally to understand their own nature. They reject the things which could help them, because they look down on them as subjective/silly/irrational.
David Goggings story is pretty interesting. I can’t say I went through as much as him, but we do have things in common. This might be why I have the courage to criticize science on LW in the first place.
I think majority of people aren’t aware of psychology and various fields under it. Ethics and decision theory kind of give a lot of clarity into such decisions when you analyse the payoff matrix. I haven’t The prince but have read excerpts from it in self-improvement related diaspora, I am not denying the value which such literature gives us, I just think we should move on by learning from it and developing on top in light of newer methods.
Beside I am more of a moral anti-realist so lol. I don’t think there is universally compelling arguments for these ethical things, but people with enough common psychological and culture grounds can cooperate.
Well it depends on your definition of inhuman, my_inhuman =/= your_inhuman value is a two place function, my peers when I was in high school found at least one of the hard sciences fun. Like them I find hard sciences pretty cool to learn about for fulfilling my other goals.
Agreed. Some fields under psychology are pathetic. But fields like the study of cognitive biases etc. are not.
Well, astrology has clearly failed me. My mom often had these luddite-adjacent ideas about what I am meant to do in life because her entire source of ethics was astrology. Astrology in career advice is like rolling a die and assigning all the well-known professions to the numbers, rather than going by actual life satisfaction or value fulfillment.
I would strongly disagree on the front of intelligence. Becoming more rational, as in having cognitive algorithms which tend to lead to systematic optimality (in this case truth-seeking and achieving goals), is indeed possible and is pretty much a part of growth.
I would weakly disagree on the front of Internal Family Systems (with the internal double crux special case being extremely useful) and other introspective reductionist methods, where you break down your emotional responses and processes into parts, understand what you like/dislike, and make various attempts to bridge the two. On this front there is a plethora of competing theories, owing to the easy problem of consciousness and the attempt to understand experience functionally.
And as for the brain not working as I want it to: when I model other parts of this brain, I find it emotionally engaged in things which aren’t optimal for some of my goals, and it isn’t contradictory with rationality to acknowledge or deal with these feelings.
I was praising Goggins because he’s more the type who is willing to fight himself, which, in more than half of the introspective models, borders on self-harm if done without acknowledgement. I find his strategy to be intuitively much better lol.
Where I would agree is that if you don’t understand something, then your theory is probably wrong. There are no confusing facts, only models which are confused by facts.
I think growth is important. I like to think of it as intelligence being compute power, while growth and learning are more about changing the algorithms. Besides, there is a good amount of correlation with IQ you might want to look into. I think this area is very contentious (I got a system-1 response to check the social norms due to past bans lol), but we’re on LessWrong, so you can continue.
You’re welcome. Maybe you should read the Sequence Highlights to get introduced to LW’s POV and understand other people’s positions here.
I don’t think there’s a reason for most people to learn psychology or game theory, as you can teach basic human behaviour and such without the academic perspective. I even think it’s a danger to be more “book smart” than “street smart” about social things. So rather than teaching game theory in college, schools could make children read and write a book report on “How to Win Friends & Influence People” in 4th grade or whatever. Academic knowledge which doesn’t make it to 99% of the population doesn’t help ordinary people much. But a lot of this knowledge is simple and easier than the math homework children tend to struggle with.
I don’t particularly believe in morality myself, and I also came to the conclusion that having shared beliefs and values is really useful, even if it means that a large group of people are stuck in a local maximum. As a result of this, I’m against people forcing their “moral” beliefs on foreign groups, especially when these groups are content and functional already. So I reject any global consensus of what’s “good”. No language is more correct than another language, and the same applies for cultures and such.
It’s funny that you should link that post, since it introduces an idea that I already came up with myself. What I meant was that people tend to value what’s objective over what’s subjective, so that their rational thinking becomes self-destructive or self-denying in a sense. Rationality helps us to overcome our biases, but thinking of rationality as perfect and of ourselves as defective is not exactly healthy. A lot of people who think they’re “super-humans” are closer to being “half-humans”, since what they’re doing is closer to destroying their humanity than overcoming or going beyond it. And I’m saying this despite the fact that some of these people are better at climbing social hierarchies or getting rich than me. In short, the objective should serve the subjective, not the other way around. “The lens which sees its own flaws” merely conditions itself to seeing flaws in everything. Some of my friends are artists, and they hate their own work because they’re good at spotting imperfections in it; I don’t consider this level of optimization to be any good for me. When I’m rational, it’s because it’s useful for me, so I’m not going to harm myself in order to become more rational. That’s like wanting money because I think it will make me happy, and then sacrificing my happiness in order to make money.
I’ll agree as long as these fields haven’t been subverted by ideologies or psychological copes against reality yet (as that’s what tends to make soft sciences pathetic). The “tall poppy syndrome” has warped the public’s perception of the “Dunning–Kruger effect”, so that it becomes an insult you can use against anyone you disagree with who is certain of themselves, especially in a social situation in which the majority disagrees.
Astrology is wrong and unscientific, but I can see why it would originate. It’s a kind of pattern recognition gone awry. Since everything is related, and the brain is sometimes lazy and thinks that correlation = causation and that “X implies Y” is the same as “Y implies X”, people use patterns to predict things, and assume that recreating the patterns will recreate the things. This is mostly wrong, of course, but not always. People who are happy are likely to smile, but smiling actually tends to make you happier as well. Do you know the tragic story behind the person who invented handwashing (Ignaz Semmelweis)? He found the right pattern, and the results were verifiable, but because his idea sounded silly, he ended up suffering.
If you had used astrology yourself, it might have ended better, as you’d be likely to interpret what you wanted to be true, and your belief that your goal in life was fated to come true would help against the periodic doubt that people face in life.
Intelligence is not something you are, it’s something you have. Identifying with your intelligence is how you disown 90% of yourself. Seeing intelligence as something available to you rather than as something you are helps eliminate internal conflict. Every “gifted kid burnout” and “depressed intelligent person” situation I have seen was partly caused by this dangerous identification. Even if you dismiss everything else I’ve said so far, I want to stress the importance of this one thing. Lastly, “systematic optimality” seems to suffer from something like Goodhart’s law. When you optimize for one variable, you may harm 100 other variables slightly without realizing it (paperclip optimizers seem like the mathematical limit of this idea). Holistic perspectives tend to go wrong less often.
I like the Internal Family Systems view. I think the brain has competing impulses whose strength depends on your physical and psychological needs. But while I think your brain is rational according to what it wants, I don’t think it’s rational according to what you want. In fact, I think people’s brains tend to toy with them completely. It creates suffering to motivate you, it creates anxiety to get you to defend yourself, it creates displeasure and tells you that you will be happy if you achieve your goals. Being happy all the time is easy, but our brain makes this hard to realize so that we don’t hack our own reward systems and die. If you only care about a few goals, your worldview is extremely simple. You have a complex life with millions of factors, but you only care about a few objective metrics? I’m personally glad that people who chase money or fame above all end up feeling empty, for you might as well just replace humanity with robots if you care so little for experiencing what life has to offer.
Oh, I know, I have a few bans from various websites myself (and I once got rate limited on here). And intelligence correlates with nihilism, meta-thinking, systemization, and anxiety (I know a study found the correlation with mental illness to be false, but I think the correlation is negative until about 120 IQ and then positive after). But why did Nikola Tesla’s intelligence not prevent him from dying poor and lonely? Why was Einstein so awkward? Why do so many intelligent people not enjoy life very much? My answer is that these are consequences of lacking humanity / healthy ways of thinking. It’s not just that stupid people are delusional. I personally like the idea that intelligence comes at the cost of instinct. For reference, I used to think rationally, I hated the world, I hated people, I couldn’t make friends, I couldn’t understand myself. Now I’m completely fine, I even overcame depression. I don’t suffer and I don’t even dislike suffering, I love life, I like socializing. I don’t worry about injustice, immorality or death.
I just found the highlights of the Sequences, and it turns out that I have read most of the posts already, or had just discovered the principles myself previously. And I disagree with a few of the moral rules, because they decrease my performance in life by making me help society. Finally, my value system is what I like, not what is mathematically optimal for some metric which people think could help society experience fewer negative emotions (I don’t even think this is true or desirable).
Honestly, I don’t know enough about people to actually tell if that’s really the case. For me, book smarts become street smarts when I make them truly a part of me.
That’s how I live anyway. For me, when you formalise street smarts they become book smarts to other people, and the latter is likely to yield better predictions, aside from the places where you lack compute, like in the case of society, where most people don’t use their brains outside of social/consensus reality. So maybe you’re actually onto something here, along the lines of “don’t tell them the truth because they cannot handle it” lol.
Well, since I wanted to dismantle the Chesterton’s fence, I did reach conclusions similar to yours regarding why it came to be and why they (the ancients) fell for it; the correlation/causation one is the general-purpose explanation. One major reason was agriculture, where it was likely to work well due to the common cause of seasons and relative star movement. So it can also be thought of as faulty generalisation.
That’s false. I wouldn’t have socially demotivated my mom (through apathy) from wasting too much money on astrology; if I had been enthusiastic about it, it would have fueled her desire. Astrology is like the false hope of a lottery, a waste of emotional energy.
I would have been likely to fall for other delusions surrounding astrology instead of spending that time learning things, for example going on a pilgrimage for a few weeks before exams, etc.
Besides, astrology predicts everything on the list of usual human behavior, and so more or less ends up predicting nothing.
Well, “more or less rational” is w.r.t. cognitive algorithms; you tend to have one variable: achieving goals. And cognitive algorithms which are better at reaching certain goals are more rational w.r.t. that goal.
There is a distinction made between truth-oriented epistemic rationality and day-to-day, goal-oriented instrumental rationality, but for me they’re pretty similar, in that for the epistemic kind the goal is truth.
I think the distinction was made because there’s a significant amount of epistemics in rationality.
If your goal is optimising 100 variables, then go with it. For a rationalist, truth tends to be their highest instrumental value; that’s the main difference imo between a rationalist and, say, a post-rationalist or a pre-rationalist. They can have other terminal values above that, like life, liberty, and the pursuit of happiness, etc.
(In case you’re not aware of the difference between terminal and instrumental values.)
I think it again depends on value being a two-place function. Some people may find fulfillment in that; I have met some who’re like that. I think quite a bit of the literature on the topic is a bit biased in favour of common morality.
I think you would need to provide evidence for such claims. My prior is set against them given the low amount of evidence I have encountered, and I cannot update it just because some cultural wisdom said so, because cultural wisdom is often wrong.
Then you weren’t thinking rationally. To quote:
Also check “firewalling the rational from the optimal” and “feeling rational”.
Also check “no one can exempt you from the laws of rationality”.
Well, then you can be mathematically optimal for the other metric. The laws of decision theory don’t stop working if you change your utility function. Unless you want to get money pumped lol, in which case your preferences are circular. Yes, you might argue that we’re not knowledgeable enough to figure out what our values will be in various subject areas; there’s a reason we have an entire field of AI alignment due to such issues, and there are various problems with inferring our desires and the limits of introspection.
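To make the money-pump point concrete, here’s a minimal sketch (the items, the fee, and the preference cycle are made up for illustration): an agent with circular preferences A ≻ B ≻ C ≻ A will pay a small fee for each “upgrade” and end up holding the same item it started with, strictly poorer.

```python
# Hypothetical circular (intransitive) strict preferences: each key is preferred over its value.
prefers = {"A": "B", "B": "C", "C": "A"}

def money_pump(start_item: str, fee: float, rounds: int) -> float:
    """Charge the agent `fee` every time it trades up to the item it prefers over its current one."""
    item, paid = start_item, 0.0
    for _ in range(rounds):
        item = next(k for k, v in prefers.items() if v == item)  # the "better" item
        paid += fee
    return paid

# After any multiple of 3 rounds the agent holds the start item again, having paid 3 fees per cycle.
print(money_pump("B", fee=1.0, rounds=6))  # 6.0
```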
There’s a lot to unpack for this first point:
Another issue with teaching it academically is that academic thought, like I already said, frames things in a mathematical and thus non-human way. And treating people like objects to be manipulated for certain goals (a common consequence of this way of thinking) is not only in bad taste, it makes the game of life less enjoyable.
Learning how to program has harmed my immersion in games, and I have a tendency to powergame, which makes me learn new videogames way faster than other people, but also with the result that I’m having less fun than them. I think rationality can result in the same thing. Why do people dislike “sellouts” and “car salesmen” if not for the fact that they simply optimize for gains in a way which conflicts with taste? But if we all just treat taste like it’s important, or refuse to collect so much information that we can see the optimal routes, then Moloch won’t be able to hurt us.
If you want something to be part of you, then you simply need to come up with it yourself; it will be your own knowledge. Learning other people’s knowledge, however, feels to me like consuming something foreign.
Of course, my defense of ancient wisdom so far has simply been to translate it into an academic language in which it makes sense. “Be like water” is the street-smarts, and “adaptability is a core component of growth/improvement/fitness” is the book-smarts. But the “street-smarts” version is easier to teach, and now that I think about it, that’s what the Bible was for.
Most things that society wastes its time discussing are wrong. And they’re wrong in the sense that even an 8-year-old should be able to see that all the controversies going on right now are frankly nonsense. But even academics cannot seem to frame things in a way that isn’t riddled with contradictions and hypocrisy. Does “We are good, but some people are evil, and we need to fight evil with evil, otherwise the evil people will win by being evil while we’re being good” not sound silly? A single thought will get you Karl Popper’s “paradox of tolerance”, and a single thought more will make you realize that it’s not a paradox but a kind of neutrality/reflexivity which makes both sides equal, and that “We need to fight evil” means “We want our brand of evil to win” as long as people don’t dislike evil itself but rather how it’s used. Again, this is not more complicated than “I punched my little brother because I was afraid he’d punch me first, and punching is bad”, which I expect most children to see the problem with.
The thought experiment I had in mind was limited to a single isolated situation; you took it much further, haha. My point was simply “If you use astrology for yourself, the outcomes are usually alright”. Same with tarot cards: as far as I’m concerned, they’re a way to talk with your subconsciousness without your ego getting in the way, which requires acting as if something else is present. Even crystal balls are probably a kind of Rorschach test, and should not be used to “read other people” for this reason. Finally, I don’t disagree with the low utility of astrology, but false hope gives people the same reassurance as real hope. People don’t suffer from the non-existence of god, but from the doubt of his existence. The actual truth value of beliefs has no psychological effects (proof: otherwise we could use beliefs to measure the state of reality).
I disagree, as I know of counter-examples. It’s more likely for somebody to become rich making music if their goal is simply to make music and enjoy themselves than if their goal is to become rich making music. You see similar effects for people who try to get girlfriends, or happiness for that matter. If X results in Y, then you should optimize for X and not for Y. Many companies are dying because they don’t realize such a simple thing (they try to exploit something pre-existing rather than making more of what they’re exploiting, for instance the trust in previous IPs). Ancient wisdom tackles this. Wu wei is about doing the right thing by not trying to do it. I don’t know how often this works, but it sometimes does.
I have to disagree that anyone’s goal is truth. I’ve seen strong evidence that knowledge of an environment is optimal for survival, and that knowledge-optimizing beats self-delusion every time, but even in this case, the real goal is “survival” and not “truth”. And my proof is the following: if you optimize for truth because it feels correct or because you believe it’s what’s best, then your core motivation is feelings or beliefs respectively. For similar reasons, non-egoism is trivially impossible. But the “Something to protect” link you sent seems to argue for this as well?
And truth is not always optimal for goals. The belief that you’re justified and the belief that you can do something are both helpful. The average person is 5⁄10 but tends to rate themselves as 7⁄10, which may be around the optimal bias.
By the way, most of my disagreements so far seem to be “Well, that makes sense logically, but if you throw human nature into the equation then it’s wrong”
I find myself a little doubtful here. People usually chase fame not because they value it, but because other people seem to value it. They might even agree cognitively on what’s valuable, but it’s no use if they don’t feel it.
How many great people’s autobiographies and life stories have you read? The nearer you get to them, the more human they seem, and if you get too close you may even find yourself crushed by pity. About Isaac Newton, it was even said “As a man he was a failure; as a monster he was superb”. Boltzmann committed suicide, John Nash suffered from schizophrenia. Philosophy is even worse off; titles like “suicide or coffee?” do not come from healthy states of mind. And have you read the Vasistha Yoga? It’s basically poison. But it’s ultimately a projection: a worldview does not reveal the world, but rather the person with the worldview.
But what saved me was not changing my knowledge, but my interpretation of it. I was right that people lie a lot, but I thought it was for their own sake, when it’s mostly out of consideration for others. I was right that people were irrational, but I didn’t realize that this could be a good thing.
That seems like it’s saying “I define rationality as what’s correct, so rationality can never be wrong, because that would mean you weren’t being rational”. By treating rationality as something which is discovered rather than created (by creating a map and calling it the territory), any flaw can be justified as “that wasn’t real rationality, we just didn’t act completely rationally because we’re flawed human beings! (our map was simply wrong!)”.
There can be no universal knowledge; maps of the territory are inherently limited (and I can prove this). Insofar as rationality uses math and verbal or written communication, it can only approximate something which cannot be put into words. “The dao of which can be spoken is not the dao” simply means “the map is not the territory”.
By the way, I think I’ve found a big difference between our views. You’re (as far as I can tell) optimizing for “Optimization power over reality / a more reliable map”, while I’m optimizing for “Biological health, psychological well-being and enjoyment of existence”.
And they do not seem to have as much in common as rationalists believe.
But if rationality in the end worships reality and nature, that’s quite interesting, because that puts it in the same boat as Taoism and myself. Some people even put Nature=God.
Finally, if my goal is being a good programmer, then a million factors will matter, including my mood, how much I sleep, how much I enjoy programming, and so on. But somebody who naively optimizes for programming skills might practice at the cost of mood, sleep, and enjoyment, and thus ultimately end up with a mediocre result. So in this case, a heuristic like “Take care of your health and try to enjoy your life” might not lose out to a rat-race-like mentality in performance. Meta-level knowledge might help here, but I still don’t think it’s enough. And the tendency to dismiss things which seem unlikely, illogical or silly is not as great a heuristic as one would think, perhaps because any beliefs which manage to stay alive despite being silly have something special about them.
Yes, intuitions can be wrong; welcome to reality. Besides, I think schools are bad at teaching things.
Yes, the trick for that is to delete the piece of knowledge you learnt and ask the question: how could I have come up with this myself?
That just sounds to me like “we need wisdom because people cannot think”. Yes, I would agree, considering that when you open Reddit, Twitter or any other platform you can find many biases being upvoted. I would agree a memetic immune system is required for a person unaware of the background literature needed to bootstrap rationality. I am not advocating for teaching anything; I don’t have plans to be an activist or the will to change society. But consider this: if you know enough rationality, you can easily get past all that.
Sure, a person should be aware when they’re drifting from the crowd and not become a contrarian, since reversed stupidity is not intelligence; and if you dissent when you have overwhelming reason for it, you’re going to have enough problems in your life.
I would agree on the latter part regarding good/evil. Unlike other rationalists, this is why I don’t have the will to change society. The internet has killed my societal moral compass for good/evil, or however you may like to put it, making me more egoistic. “Good” just carries a positive system-1 connotation for me; I am just emoting it, but I mostly focus on my life. Or, to be brutally honest about it, I don’t care about society as long as my interests are being fulfilled.
Agreed, the map is not the territory; it feels the same to be wrong as it feels to be right.
Yes, if someone isn’t passionate about such endeavours, they may not have the will to sustain them. But if a person is totally apathetic to monetary concerns, they’re not going to make it either. So a person may argue, on a meta level, that it’s more optimal to be passionate about a field, or to choose a field you’re passionate about and want to do better in, in order to overcome akrasia; and there might be some selection bias at play where a person who’s good at something is likely to have a positive feedback loop with the subject.
Yes, exactly, truth is in the highest service to other goals, if my phrasing of “highest instrumental value” wasn’t clear. But you don’t deliberately believe false things; that’s what rationality is all about. Truth is nice to have, but usefulness is everything.
Believing false things purposefully is impossible either way; you’re not anticipating them with high probability. That’s not how rationalist belief works. When you believe something, that’s how reality is to you; you look at the world through your beliefs.
Not many, but it would be unrepresentative to generalise from that.
Ethically yes, epistemically no. Reality doesn’t care; this is what society gets wrong. If I am disagreeing with your climate denial or climate catastrophism, I am not proposing what needs to be done; there is a divide between morals and epistemics.
Yes, finally you get my point. We label those things rationality, the things which work. The virtue of empiricism. Rationality is about having cognitive algorithms which systematically have higher returns on whatever it is that you want.
I would disagree; physics is more accurate than intuitive world models. The act of guessing a hypothesis is reverse-engineering experience, so to speak; you get a causal model which is connected to you in the form of anticipations (this link is part of a sequence, so there’s a chance there’s a lot of background info).
When you experience something your brain forms various models of it, and you look at the world through your beliefs.
That’s a misrepresentation of my position; I said truth is my highest instrumental value, not my highest terminal value. Besides, a good portion of hardcore rationalists tend to have something to protect, a humanistic cause which they devote themselves to, and that tends to be aligned with their terminal values however they may see fit. Others may solely focus on their own interests like health, life and wellbeing.
To reiterate, you only seek truth as much as it allows you to get what you want, but you don’t believe in falsities. That’s it.
Rationality doesn’t necessarily have nature as a terminal value; rationality is a tool, the set of cognitive algorithms which work for whatever you want, with truth being the highest instrumental value, as you might have read in the “something to protect” article.
Rationalists tend to have heavy respect for cognitive algorithms which systematically get us what we desire. They’re disturbed if there’s a violation in the process which gets us there.
None of that is incompatible with rationality; rather, rationality will help you get there. Heuristics like “take care of your health and try to enjoy life” seem more like vague plans to fulfill your complex set of values, which one may discover more about. Values are complex, and there are various posts you can find here which may help you model yourself better and reach reflective equilibrium, which is the best you can do either way, both epistemically and morally (the former of which is much more easily reached by focusing on getting better w.r.t. your values than by focusing solely on it, as highlighted by the post, since truth is only instrumental).
But these ways of looking at the world are not factually wrong, they’re just perverted in a sense.
I agree that schools are quite terrible in general.
That helps for learning facts, but one can teach the same things in many different ways. A math book from 80 years ago may be confusing now, even if the knowledge it covers is something that you know already, because the terms, notation and ideas are slightly different.
In a way. But some people who have never learned psychology have great social skills, and some people who are excellent with psychology are poor socializers. Some people also dislike “nerdy” subjects, and it’s much more likely that they’d listen to a TED talk on body language than read a book on evolutionary psychology and non-verbal communication. Having an “easy version” of knowledge available which requires 20 IQ points less than the hard version seems like a good idea.
Some of the wisest and most psychologically healthy people I have met have been non-intellectual and non-ideological, and even teenagers or young adults. Remember your “Things to unlearn from school” post? Some people may have less knowledge than the average person, and thus fewer errors, making them clear-sighted in a way that makes them seem well-read. Teaching these people philosophy could very well ruin their beautiful worldviews rather than improve on them.
I don’t think “rationality” is required. Somebody who has never heard about the concept of rationality, but who is highly intelligent and thinks things through for himself, will be alright (outside of existential issues and infohazards, which have killed or ruined a fair share of actual geniuses).
But we’re both describing conditions which apply to less than 2% of the population, so at best we have to suffer from the errors of the 98%.
I’m not sure what you mean by “when you dissent when you have an overwhelming reason”. The article you linked to worded it “only when”, as if one should dissent more often, but it also warns against dissenting since it’s dangerous.
By the way, I don’t like most rationalist communities very much, and one of the reasons is that they have a lot of snobs who will treat you badly if you disagree with them. The social mockery I’ve experienced is also quite strong, which is strange since you’d expect intelligence to correlate with openness, and for the high rate of autistic people to combat some of the conformity.
I also don’t like activism, and the only reason I care about the stupid ideas of the world is that all the errors are making life harder for me and the people that I care about. Like I said, not being an egoist is impossible, and there’s no strong evidence that all egoism is bad, only that egoism can be bad. The same goes for money and power, I think they’re neutral and both potentially good/bad. But being egoistic can make other people afraid of me if I don’t act like I don’t realize what I’m doing.
I think this is mostly correct. But optimization can kill passion (since you’re just following the meta and not your own desires). And common wisdom says “Follow your dreams” which is sort of naive and sort of valid at the same time.
I think believing something you think is false, intentionally, may be impossible. But false beliefs exist, so believing in false things is possible. For something where you’re between 10% and 90% sure, you can choose if you want to believe in it or not, and then use the following algorithm:
Say “X is true because” and then allow your brain to search through your memory for evidence. It will find some.
The articles you posted on beliefs are about the rules of linguistics (“belief in belief” is a valid string) and logic, but how belief works psychologically may be different. I agree that real beliefs are internalized (exist in system 1) to the point that they’re just part of how you anticipate reality. But some beliefs are situational and easy to consciously manipulate (example: self-esteem. You can improve or harm your own self-esteem in about 5 minutes if you try, since you just pick a perspective and set of standards in which you appear to be doing well or badly). Self-esteem is subjective, but I don’t think the brain differentiates between subjective and objective things; it doesn’t even know the difference.
And it doesn’t seem like you value truth itself, but that you value the utility of some truths, and only because they help you towards something you value more?
You may believe this because a worldview has to be formed through interactions with the territory, which means that a worldview cannot be totally unrelated to reality? You may also mean this: that if somebody has both knowledge and value judgements about life, then the knowledge is either true or false, while the value judgements are a function of the person. A happy person might say “Life is good” and a depressed person might say “Life is cruel”, and they might even know the same facts.
Online “black pills” are dangerous, because the truth value of the knowledge doesn’t imply that the negative worldview of the person sharing it is justified. Somebody reading the Vasistha Yoga might become depressed because he cannot refute it, but this is quite an advanced error in thinking, as you don’t need to refute it for its negative tone to be false.
But then it’s not about maximizing truth, virtue, or logic.
If reality operates by different axioms than logic, then one should not be logical.
The word “virtue” is overloaded, so people write as if the word is related to morality, but it’s really just about thinking in ways which make one more clear-sighted. So people who tell me to have “humility” are “correct” in that being open to changing my beliefs makes it easier for me to learn, which is rational, but they often act as if they’re better people than me (as if I’ve made an ethical/moral mistake in being stubborn or certain of myself).
By truth, one means “reality” and not the concept “truth” as the result of a logic expression. This concept is overloaded too, so that it’s easy for people to manipulate a map with logical rules and then tell another person “You’re clearly not seeing the territory right”.
Physics is our own constructed reality, which seems to act a lot like the actual reality. But I think an infinite number of physics could exist which predict reality with high accuracy. In other words, “There’s no one true map”. We reverse-engineer experiences into models, but experience can create multiple models, and multiple models can predict experiences.
One of the limitations is “there’s no universal truth”, but this is not even a problem, as the universe is finite. But “universal” in mathematics is assumed to be truly universal, covering all things, and it’s precisely this which is not possible. But we don’t notice, and thus come up with the illusion of uniqueness. And it’s this illusion which creates conflict between people, because they disagree with each other about what the truth is, claiming that conflicting things cannot both be true. I dislike the consensus because it’s the consensus and not a consensus.
My bad for misrepresenting your position. Though I don’t agree that many hardcore rationalists care for humanistic causes. I see them as placing rationality above humanity, and thus preferring robots, cyborgs, and AIs above humanity. They think they prefer an “improvement” of humanity, but this functionally means the destruction of humanity. If you remove negative emotions (or all emotions entirely; after all, these are the source of mistakes, right?), subjectivity, and flaws from humans, and align them with each other by giving them the same personality, or get rid of the ego (it’s also a source of errors and unhappiness), what you’re left with is not human. It’s at best a sentient robot. And this robot can achieve goals, but it cannot enjoy them.
I just remembered seeing the quote “Rationality is winning”, and I’ll admit this idea sounds appealing. But a book I really like (EST: Playing the game the new way, by Carl Frederick) is precisely about winning, and its main point is this: You need to give up on being correct. The human brain wants to have its beliefs validated, that’s all. So you let other people be correct, and then you ask them for what you want, even if it’s completely unreasonable.
I meant nature as its source (of evidence/truth/wisdom/knowledge). “Nature” meaning reality/the dao/the laws of physics/the universe/GNON. I think most schools of thought draw their conclusions from reality itself. The only kind of worldview which seems disconnected from reality is religions which create ideals out of what’s lacking in life and make those out to be virtue and the will of god.
What I dislike might not be rationality, but how people apply it, and the psychological tendencies of people who apply it. But upvotes and downvotes seem very biased in favor of consensus and verifiability, rather than simply being about getting what you want out of life. People also don’t seem to like being told accurate heuristics which seem immoral or irrational (by the colloquial definition that regular people use) even if they predict reality well. There’s also an implicit bias towards altruism which cannot be derived from objective truth.
About my values, they already exist even if I’m not aware of them; they’re just unconscious until I make them conscious. But if system 1 functions well, then you don’t really need to train system 2 to function well, and it’s a pain to force system-2 rationality onto system 1 (your brain resists most attempts at self-modification). I like the topic of self-modification, but that line of study doesn’t come up on LW very often, which is strange to me. I still believe that the LW community downplays the importance of human nature and psychology. It may even undervalue system-1 knowledge (street smarts and personal experiences) and overvalue system-2 knowledge (authority, book smarts, and reasoning).
Honestly, the majority of the points presented here are not new and have already been addressed in
https://www.lesswrong.com/rationality
or https://www.readthesequence.com/
I got into this conversation because I thought I would find something new here. As an egoist I am voluntarily leaving this conversation in disagreement because I have other things to do in life. Thank you for your time.
The short version is that I’m not sold on rationality, and while I haven’t read 100% of the sequences it’s also not like my understanding is 0%. I’d have read more if they weren’t so long. And while an intelligent person can come up with intelligent ways of thinking, I’m not sure this is reversible. I’m also mostly interested in tail-end knowledge. For some posts, I can guess the content by the title, which is boring. Finally, teaching people what not to do is really inefficient, since the space of possible mistakes is really big.
Your last link needs an s before the dot.
Anyway, I respect your decision, and I understand the purpose of this site a lot better now (though there’s still a small, misleading difference between the explanation of rationality and how users are behaving. Even the name of the website gave the wrong impression).