I am wary of the connotation that I need someone to help me decide whether my feelings align with the Truth.
I’m not sure what this means. Could you elaborate?
What I imagine you to mean seems similar to the sentiment expressed in the first comment on this blog post. That comment strikes me as so horrifically misguided that I had a strong physiological response to reading it. Basically, the commenter thought that since he doesn’t experience himself as following rules when formulating thoughts and sentences, he doesn’t follow them. This confusion of the map with the territory stuck in my memory for some reason, and your comment reminded me of it because you seem to be expressing a very strong faith in the accuracy of how things seem to you.
Feel free to just explain yourself without feeling obligated to read a random blog post or telling me how I am misreading you, which would be a side issue.
I think my response to lukeprog above answers this in a way, but it’s really a question of what we mean by “help me decide.” I’m not against people helping me be less wrong about the actual content of the territory. I’m just against people helping me decide how to emotionally respond to it, provided we are both already not wrong about the territory itself.
If I am happy because I have plenty of food (in the map), but I actually don’t (in the territory), I’d certainly like to be informed of that. It’s just that I can handle the transition from happy to “oh shit!” all by myself, thank you very much.
In other words, my suspicion of anyone calling themselves an Empathetic Metaethicist is that they’re going to try to slide in their own approved brand of ethics through the back door. This is also a worry I have about CEV. Hopefully future posts will alleviate this concern.
If you mean that in service of my goal of satisfying my actual desires, there is more of a danger of being misled when getting input from others as to whether my emotions are a good match for reality than when getting input as to whether reality matches my perception of it, I tentatively agree.
If you mean that getting input from others as to whether my emotions are a good match for reality has a greater cost than benefit, I disagree, assuming basic advice filters similar to those I use when getting input as to whether reality matches my perception of it. As per the above, all else being equal, the expected payoff of getting advice in this area will be lower, even though the potential advantages are similar.
If you mean that there is a fundamental difference in kind between matching perception to reality and emotions to perceptions that makes getting input an act that is beneficial in the former case and corrosive in the latter, I disagree.
I have low confidence regarding which emotions are most appropriate for various crises and non-crises, and I suspect that the responses I think of as ideal are at best local peaks with little chance of being optimal. In addition, what I think of as optimal emotional responses are likely to be too resistant to exceptions. E.g., if one is trapped in a mine shaft, the emotional response suitable for typical cases of being trapped is likely to consume too much oxygen.
I’m generally open to ideas regarding what my emotions should be in different situations, and how I can act to change my emotions.