At the moment, LW has provided negative benefit to my life. I recently quit my job to start learning positive psychology. My initial goal was to blog about positive psychology, and eventually use my blog as a platform to sell a book.
LW has made me deeply uncertain of the accuracy of the research I read, the words I write on my blog, and the advice I am writing in the book I intend to sell. Long-term, the uncertainty will probably help me by making me more knowledgeable than my peers, but in the short term it demotivates me (e.g. if I were sure that what I was learning was correct, I would enthusiastically proselytize, which is a much more effective blogging strategy).
Still, I read on, because I’ve passed the point of ignorance.
I also think that LW has provided negative benefit to my life. Since I decided that I wanted my beliefs to be true, rather than pleasing to me, I’ve felt less connected to my friendship group. I used to have certain political views that a lot of my friends approved of. Now, I think I was wrong about many things (not totally wrong, but I’m far less confident of the views that I continue to hold). Overall, I’d rather believe true things, but I think so far it’s made me less happy.
Why would you rather believe true things?
1. I would just rather know the right answer!
2. I think believing true things has better consequences than the reverse, for many people. I’m not sure if it will for me.
3. It’s too late. I can’t decide to go back to believing things that aren’t true to make me feel better, because I’d know that’s what I was doing.
Would you not prefer to believe true things?
No, I would not not-prefer to believe true things.
That said, I also don’t experience believing true things as making me unhappy the way you describe.
It’s the combination of those statements that intrigues me: X makes you unhappy and you would rather do X. So I was curious as to why you would rather do it.
I have to admit, though, your answers leave me even more puzzled.
Here are a couple of other reasons:
4. So, I suppose in some ways, feeling that my beliefs are more accurate has given me some sort of satisfaction. I don’t know if it outweighs feeling disconnected socially, though.
5. Altruism. I used to put a lot of energy into UK politics. I gained moral satisfaction and approval from my friends for this, but I’ve come to think that it’s really not a very effective way of improving the world. I would rather learn about more effective ways of making the world better (e.g. donating to efficient charities).
Does that make sense? If you did feel that believing true things made you unhappy, would you try to make yourself believe not-true but satisfying things?
Altruism makes some sense to me as an answer… if you’re choosing to sacrifice your own happiness in order to be more effective at improving the world, and believing true things makes you more effective at improving the world, then that’s coherent.
Unrelatedly, if the problem is social alienation, one approach is to find a community in which the things you want to do (including believing true things) are socially acceptable.
There are areas in which I focus my attention on useful and probably false beliefs, like “I can make a significant difference in the world if I choose to take action.” It’s not clear to me that I believe those things, though. It’s also not clear to me that it matters whether I believe them or not, if they are motivating my behavior just the same.
That’s how I felt for the first few months after discovering that Jesus wasn’t magic after all. At that moment, all I could see was that (1) my life up to that point had largely been wasted on meaningless things, (2) my current life plans were pointless, (3) my closest relationships were now strained, and (4) much of my “expertise” was useless.
Things got better after a while.
I’m tempted to conclude that your accumulated utility so far, given LW, is lower than in the counterfactual no-LW case, but that, in compensation, your expected future utility has risen considerably, by an unknown margin, with relatively high confidence.
Is this an incorrect interpretation of the subtext? Am I reading too much into it?
That interpretation is correct.
I’ve noticed that I don’t even need to be knowledgeable to gain utility—there is a strong correlation between signaling my ‘knowledgeableness’ and post popularity—my most popular post had the largest number of references (38), and so on. When writing the post, I just hide the fact that I did so much research because of my uncertainty :)