I’m not dismissing it; I’m endorsing it, and agreeing with you that it has been my approach ever since my first post on LW.
I wasn’t talking about you; I was talking about SI’s approach to spreading and training rationality. You (SI) have Yudkowsky writing books, you have rationality minicamps, you have LessWrong, you and others are writing rationality articles and researching the rationality literature, and so on.
What I’m saying is that this kind of rationality training, research, and messaging looks poorly leveraged for achieving your goals. Poorly leveraged for anyone trying to achieve goals, really. And at its most abstract, that’s what rationality is, right? Achieving your goals.
So, I don’t care if your approach was to acquire as much relevant knowledge as possible before dabbling in debiasing, Bayes, and whatnot (i.e., prioritizing the most leveraged approach). I’m wondering why your approach doesn’t seem to be SI’s approach. I’m wondering why SI doesn’t prioritize rationality training, research, and messaging by whatever is most leveraged for achieving SI’s goals. I’m wondering why SI doesn’t spread the virtue of scholarship even at the expense of training debiasing and so on.
SI wants to raise the sanity waterline; is what SI is doing even near-optimal for that? Everything SIers knew and trained in couldn’t get them to notice, for years, an opportunity to trade on opportunity cost; that is sad.
I’m lost again; I don’t know what you’re saying.