There is nothing about being a rationalist that says that you can’t believe in God. I think the key point of rationality is to believe in the world as it is rather than as you might imagine it to be, which is to say that you believe in the existence of things due to the weight of evidence.
Ask yourself: do you want to believe in things due to evidence?
If the answer is no, then you have no right calling yourself a “wannabe rationalist” because, quite simply, you don’t want to hold rational beliefs.
If the answer is yes, then put this into practice. Is the moon smaller than the earth? Does Zeus exist? Does my toaster still work? In each case, what is the evidence?
If you find yourself believing something that you know most rationalists don’t believe in, and you think you’re basing your beliefs on solid evidence and logical reasoning, then by all means come and tell us about it! At that point we can get into the details of your evidence and the many more subtle points of rational reasoning in order to determine whether you really do have a good case. If you do, we will believe.
Uh-oh.
I… I don’t think I do want to believe in things due to evidence. Not deep down inside.
When choosing my beliefs, I use a more important criterion than mere truth. I’d rather believe, quite simply, in whatever I need to believe in order to be happiest. I maximize utility, not truth.
I am a huge fan of lesswrong, quoting it almost every day to increasingly annoyed friends and relatives, but I am not putting much of what I read there into practice, I must admit. I read it more for entertainment than enlightenment.
And I take notes, for those rare cases in my life where truth actually is more important to my happiness than social conventions: when I encounter a real-world problem that I actually want to solve. This happens less often than you might think.
Here’s another set of downvotes I don’t get (ETA: parent was at −2 when I arrived). Gelisam is just stating their personal experience, not in order to claim we must all do likewise, but as their own reaction to the debate.
I think this community would be ill served by a norm that makes it a punishable offense to ever admit one doesn’t strive for truth as much as one ought.
As far as replies go:
It’s not so simple. If you’re self-deceiving, you might be quite wrong about whether your beliefs actually make you happier! There’s a very relevant post on doublethink.
Agreed.
Ah, so that’s why people downvoted my comment! Thanks for explaining. I thought it was only because I appeared to be confusing utilons with hedons.
Regarding the doublethink post, I agree that I couldn’t rationally assign myself false but beneficial beliefs, and I feel silly for writing that I could. On the other hand, sometimes I still want to hold false but beneficial beliefs, and that’s why I can’t pretend to be an aspiring rationalist.
“Maximizing truth” doesn’t make any sense. You can’t maximize truth. You can improve your knowledge of the truth, but the truth itself is independent of your brain state.
In any case, when is untruth more instrumental to your utility function than truth? Having accurate beliefs is an incredibly useful thing. You may well find it serves your utility better.
I think it’s fairly obvious that “maximizing truth” meant “maximizing the correlation between my beliefs and truth”.
Truth is overrated. My prior was heavily biased toward truth, but then a brief and unpleasant encounter with nihilism caused me to lower my estimate.
And before you complain that this doesn’t make any sense either, let me spell out that it is an estimate of the probability that the strategy “pursue truth first, happiness second” yields, on average, more hedons than “pursue happiness using the current set of beliefs”.
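To spell it out even further, here is a rough sketch of what I mean, with $H(s)$ standing for the hedons I end up with under strategy $s$ (notation I’m introducing just for this comment, not anything from the posts I’m quoting):

$$p \;=\; \Pr\!\Big(\mathbb{E}\big[H(s_{\text{truth}})\big] \;>\; \mathbb{E}\big[H(s_{\text{happiness}})\big]\Big),$$

where $s_{\text{truth}}$ is “pursue truth first, happiness second”, $s_{\text{happiness}}$ is “pursue happiness using the current set of beliefs”, and the outer probability reflects my own uncertainty about how the world works. Lowering my estimate of $p$ is what I meant by lowering my bias toward truth.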
Have you ever had the experience of learning something true that you would rather not have learned? The only type of examples I can think of here (off the top of my head) would be finding out you had an unfaithful lover, or that you were really adopted. But in both cases, it seems like the ‘unhappiness’ you get from learning it would pass and you’d be happy that you found out in the long run.
I’ve heard people say similar things about losing belief in God, because it could lead to losing (or at least drifting away from) people you hold close, if their belief in God had been an important thing in their relationship with you.
Yes. Three times, in fact. Two of them are of roughly the same class as that one thing floating around, and the third is of a different class and far worse than the other two (involving life insurance and charity: you’ll find it if you look).