Why is there such a large gap in the exploration of emotions on LessWrong? Is it because they are colloquially anathema to rationality?
I don’t think that’s accurate. In fact, Eliezer says as much in Why Truth?. He explicitly calls out the view that rationality and emotion are opposed, using the character of Mr. Spock in Star Trek to illustrate his point. In his view, Mr. Spock is irrational, just like Captain Kirk, because denying the reality of emotions is just as foolish as giving in wholeheartedly to them. If your emotions rest on true beliefs, then they are rational. If they rest on false beliefs, they are irrational. The fact that they are instinctive emotions rather than reasoned logic is irrelevant to their (ir)rationality.
I think LessWrong has actually done a fairly good job at avoiding this mistake. If we look at the posts on circling [1], [2], for example, you’ll see that they’re all about emotions and management of emotions. The same applies to Comfort Zone Expansion, ugh fields, meditation and Looking, and kenshō. It’s just that few of them actually mention the word “emotion” in their titles, which might lead one to the false assumption that they are not about emotions.
Also, see the Emotions tag. So even if you just directly search for the term, you will find much more than just 5 results.
There’s also Alicorn’s sequence on luminosity, which explicitly deals with emotions despite (apparently) not being tagged as such: https://www.lesswrong.com/s/ynMFrq9K5iNMfSZNg
Also Nate’s Replacing Guilt sequence. I’m still reading it, but I predict it’ll be the single most important sequence to me.
Interesting. I’ve seen this argument in other areas, and I believe it’s a step in the right direction. However, there’s still a gap between how a belief is encoded and how it is updated.
I do like Eliezer’s formulation of rationality. The nuance is that emotions are the output of a learning system that, according to Karl Friston’s free-energy principle, is near-optimal at steering the organism away from high-entropy (surprising) states.
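To make the Friston point a bit more concrete, here is a minimal toy sketch (my own illustration, not anything from the thread or from Friston's full formalism) of the core mechanic behind free-energy minimization: a belief is updated by gradient descent on precision-weighted prediction error, which drives the simplified free energy down. All names and constants are invented for the example.

```python
# Toy predictive-coding model: a single scalar belief (mu) about a hidden
# cause, updated by gradient descent on a simplified variational free energy.
# This is an illustrative sketch under Gaussian assumptions, not a faithful
# implementation of the free-energy principle.

def free_energy(mu, obs, prior_mu, pi_obs=1.0, pi_prior=1.0):
    """Simplified free energy: sum of precision-weighted squared errors."""
    sensory_error = pi_obs * (obs - mu) ** 2 / 2
    prior_error = pi_prior * (mu - prior_mu) ** 2 / 2
    return sensory_error + prior_error

def update_belief(mu, obs, prior_mu, lr=0.1, pi_obs=1.0, pi_prior=1.0):
    """One gradient-descent step on free energy with respect to mu."""
    grad = -pi_obs * (obs - mu) + pi_prior * (mu - prior_mu)
    return mu - lr * grad

mu, prior_mu, obs = 0.0, 0.0, 2.0   # start at the prior; observe a surprise
for _ in range(100):
    mu = update_belief(mu, obs, prior_mu)

# With equal precisions, the belief settles midway between prior and
# observation (here, near 1.0), and free energy ends lower than it started.
```

On this toy picture, an emotion would correspond to something like a large, persistent prediction error signal; the point is only that such a system tends toward low-surprise states, not that this code captures emotions themselves.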