(I’m teasing you to some extent. What I regard to be the answers to many of the questions I’m asking can be found in the Sequences.)
I know the answers to most of these questions can be found in the Sequences because I read them. However, the Sequences contain quite a bit of information, and it is clear that not all of it, and probably not even most of it, made it into the way I think. Your asking me these questions is extremely helpful for filling in those gaps, and I appreciate it.
Why do you believe that? Even given that you believe this is currently true, do you think this is something you should change about yourself, and if not, why?
I believe that because I do not have the mental discipline required to both know a belief is false and still gain happiness from it.
It is possible that I could change this about myself, but I don’t see myself ever learning the discipline required to lie to myself (if doublethink is actually possible).
It’s also possible to go the other way and say that something injured my brain and brought my intelligence to a level where I could no longer see why I should think one way instead of another, or could no longer see the truth-vs.-happiness decision, which would let me pick happiness without lying to myself.
I think that most of reason two is based on that heuristic, which allows you to gain evidence for the claim even though the claim remains unfalsifiable and the evidence is only weak.
Your asking me these questions is extremely helpful for filling in those gaps, and I appreciate it.
Glad to hear that. I was afraid I might be being a little too harsh.
I believe that because I do not have the mental discipline required to both know a belief is false and still gain happiness from it.
I guess I should clarify what I was trying to say. If you optimize for truth and not happiness, you will seek out a whole bunch of truths whether or not you expect that knowing those truths will make you happier. If you optimize for happiness and not truth, you’ll only seek truths that will help make you happier. I’m not asking you to consider explicitly lying to yourself, which is in some sense hard, but I’m asking you to consider the implications of optimizing for truth vs. optimizing for happiness.
Whether or not you do, most people do not optimize for truth. Do you think this is a good thing or a bad thing, and in either case, why?
I think that most of reason two is based on that heuristic, which allows you to gain evidence for the claim even though the claim remains unfalsifiable and the evidence is only weak.
I think you’ve lost track of why we were talking about Einstein. In the original post, you listed two reasons to believe non-falsifiable things. I asked you to give an example of the first one. Maybe it wasn’t sufficiently clear that I was asking for an example which wasn’t falsifiable, in which case I apologize, but I was (after all, that’s why it came up in the first place). Relativity is falsifiable. A heuristic that beautiful things tend to be true is also falsifiable.
quote break
Whether or not you do, most people do not optimize for truth. Do you think this is a good thing or a bad thing, and in either case, why?
Perhaps it would be easier for me to replace the word “happiness” with “awesomeness”. In that case I can see the argument that optimizing for awesomeness would let me seek out ways to make the world more awesome, and would let the specifics of what I consider awesome govern which truths I seek out. In this way I can understand optimizing for awesomeness.
I think it is a good thing that most people do not optimize for truth, because if they did, I don’t think the resulting world would be awesome. It would be a world where many people were less happy, even though it would probably also be a world with more scientific advances.
I suppose that if anyone were to optimize for truth, it would be a minority who wanted to advance science in order to make the general population happier, even when the scientists themselves were not. Even in this case I can understand the argument that they were optimizing for awesomeness, not truth, because they thought the resulting world would be more awesome.
I still don’t see how anything you’ve said about Einstein is relevant to the original question I asked, which was for an example of a belief that you thought was beautiful, non-falsifiable, and worth holding.
I think it is a good thing that most people do not optimize for truth, because if they did, I don’t think the resulting world would be awesome. It would be a world where many people were less happy, even though it would probably also be a world with more scientific advances.
Cool. So we agree now that truth does not trump awesomeness? (Somewhat tangential comment: science is not the only way to seek out truth. I also have in mind things like finding out whether you were adopted.)
You’re right, Einstein was not relevant to your original question. I brought him up because I did not understand the question until:
I think you’ve lost track of why we were talking about Einstein. In the original post, you listed two reasons to believe non-falsifiable things. I asked you to give an example of the first one. Maybe it wasn’t sufficiently clear that I was asking for an example which wasn’t falsifiable, in which case I apologize, but I was (after all, that’s why it came up in the first place). Relativity is falsifiable. A heuristic that beautiful things tend to be true is also falsifiable.
Thanks for leading me to the conclusion that truth does not trump awesomeness; yes, I now agree with this.
I also have in mind things like finding out whether you were adopted
I know the answers to most of these questions can be found in the Sequences because I read them. However, the Sequences contain quite a bit of information, and it is clear that not all of it, and probably not even most of it, made it into the way I think. Your asking me these questions is extremely helpful for filling in those gaps, and I appreciate it.
I believe that because I do not have the mental discipline required to both know a belief is false and still gain happiness from it. It is possible that I could change this about myself, but I don’t see myself ever learning the discipline required to lie to myself (if doublethink is actually possible). It’s also possible to go the other way and say that something injured my brain and brought my intelligence to a level where I could no longer see why I should think one way instead of another, or could no longer see the truth-vs.-happiness decision, which would let me pick happiness without lying to myself.
I think that most of reason two is based on that heuristic, which allows you to gain evidence for the claim even though the claim remains unfalsifiable and the evidence is only weak.
Glad to hear that. I was afraid I might be being a little too harsh.
I guess I should clarify what I was trying to say. If you optimize for truth and not happiness, you will seek out a whole bunch of truths whether or not you expect that knowing those truths will make you happier. If you optimize for happiness and not truth, you’ll only seek truths that will help make you happier. I’m not asking you to consider explicitly lying to yourself, which is in some sense hard, but I’m asking you to consider the implications of optimizing for truth vs. optimizing for happiness.
Whether or not you do, most people do not optimize for truth. Do you think this is a good thing or a bad thing, and in either case, why?
What is this referring to?
quote break
Perhaps it would be easier for me to replace the word “happiness” with “awesomeness”. In that case I can see the argument that optimizing for awesomeness would let me seek out ways to make the world more awesome, and would let the specifics of what I consider awesome govern which truths I seek out. In this way I can understand optimizing for awesomeness.
I think it is a good thing that most people do not optimize for truth, because if they did, I don’t think the resulting world would be awesome. It would be a world where many people were less happy, even though it would probably also be a world with more scientific advances.
I suppose that if anyone were to optimize for truth, it would be a minority who wanted to advance science in order to make the general population happier, even when the scientists themselves were not. Even in this case I can understand the argument that they were optimizing for awesomeness, not truth, because they thought the resulting world would be more awesome.
I still don’t see how anything you’ve said about Einstein is relevant to the original question I asked, which was for an example of a belief that you thought was beautiful, non-falsifiable, and worth holding.
Cool. So we agree now that truth does not trump awesomeness? (Somewhat tangential comment: science is not the only way to seek out truth. I also have in mind things like finding out whether you were adopted.)
You’re right, Einstein was not relevant to your original question. I brought him up because I did not understand the question until
Thanks for leading me to the conclusion that truth does not trump awesomeness; yes, I now agree with this.
Good point