Something has been bothering me ever since I began trying to implement many of the lessons in rationality here. I feel like there needs to be an emotional reinforcement structure, a cognitive foundation that is both pliable and supportive of truth-seeking, before I can even get into the why, how, and what of rationality. My successes in this area have been only partial, but it seems that the better structured the cognitive foundation is, the easier it is to adopt, discard, and manipulate new ideas.
I understand this is likely a fairly meta topic, and would likely require at least some basic rationality to bootstrap into existence, but I am going to try to define the problem: what is this necessary cognitive foundation? And then break it down into pieces. I suspect that much of it lies in subverbal emotional and procedural cues, but if so, how can those be trained more effectively?
I think the phrasing of your question is confusing. Are you asking for help putting yourself into a mindset conducive to learning and developing rationality skills?
Let me see if I can be clearer. In my experience I have an emotional framework from which I hang beliefs. Each belief has a specific emotional reinforcement, or structure, that allows me to believe it. If I revoke that reinforcement, then very soon afterward I find that I no longer hold that belief. I guess the first question I should ask is: is this emotional framework real? Did I make it up? And if it is real, how can I use it to my advantage?
How did I build this framework, and how do I revoke emotional support? I have good reason to think the framework isn’t simply natural to me, since it has changed so much over time.
One technique I use to internalize certain beliefs is to determine their implied actions, then take those actions while noting that they’re the sort of actions I’d take if I “truly” believed. Over time the belief becomes internal, not something I have to recompute every time a related decision comes up. I don’t know precisely why this works, but my theory is that it has to do with what I perceive my identity to be. Often this process exposes other actions I take which are not in line with the belief. I’ve used this for things like “animal suffering is actually bad”, “FAI is actually important”, and “I actually need to practice to write good UIs”.
This is similar to my experience. Perhaps a better way to express my problem is this: what are some safe and effective ways to construct and dismantle an identity? And what sorts of identity are best able to incorporate new information and process it into rational beliefs? One strategy I have used in the past is to simply not claim ownership of any belief, so that I might release it more easily, but then I run into a lack of motivation when I try to act on those beliefs. On the other hand, if I define my identity by a set of beliefs, then any threat to them is extremely painful.
That was my original question: how can I build an identity, or cognitive foundation, that motivates me but is not painfully threatened by counterevidence?
The Litany of Tarski and the Litany of Gendlin exemplify a pretty good attitude to cultivate. (Check out the posts linked in the Litany of Gendlin wiki article; they’re quite relevant too. After that, the sequence on How to Actually Change Your Mind contains still more helpful analysis and advice.)
This can be one of the toughest hurdles for aspiring rationalists. I want to emphasize that it’s OK and normal to have trouble with this, that you don’t have to get everything right on the first try (and to watch out if you think you do), and that eventually the world will start making sense again and you’ll see it was well worth the struggle.
The emotional framework of which you speak doesn’t seem to resemble anything I can introspectively access in my own head, but maybe I can offer advice anyway. Some emotional motivations conducive to rationality are curiosity and a powerful need to accomplish some goal that might depend on your acting rationally.
How much of the Sequences have you read? A lot of them are about, essentially, how to feel like a rationalist.
I have read pretty much everything more than once. It is pretty difficult to turn reading into action, though, which is why I feel like there is something I am missing.
Yep.