But this is one of my issues with what I have seen at lesswrong: the privileging of predictive utility over other forms of utility, of epistemic rationality over instrumental rationality. Epistemic rationality is just another form of instrumental rationality, but where rationalists gather, it gets privileged as if it were the only true rationality, or at least a better rationality. That’s a mistake, and it really impairs the ability of rationalists to understand other people who do not privilege epistemic rationality to the same degree, if at all.
You say “improper” is not used in a pejorative sense, but clearly the normal usage of “improper” is pejorative. And when an epistemic utility competes with another instrumental utility, why doesn’t that equally make the epistemic utility improper?
Further, the non-epistemic beliefs are described as improper, but these other forms are arguably “not belief at all”.
Time and time again, epistemic rationality is set up as the real, better, higher, truer, shinier rationality.
Just to be clear, I’m not here to trash the idea. I came to the site from reading EY’s Harry Potter fan fiction, which is just awesome, and I’m dying for the next chapter. Between the book and the Sequences, I’m busy reading a guy making all my arguments and more, and rereading many of the key books I read years ago in graduate school. Korzybski and Jaynes are at the top of my pantheon (along with Stirner, whose influence I don’t see much of here). So I’m here because of some very specific and fundamental shared methodology.
I don’t say “me too” to all that I agree with, unless it is something new to me or I have a refinement to add. But on this point, I see privileging of epistemic rationality, and I think it’s a mistake.
You would put instrumental rationality above epistemic rationality?
So if it makes me happy to believe the Moon is made of cheese, I ought to do so?
If making yourself happy is, all things considered, what you want to do. (And then assuming that said belief modification is the most effective way to gain happiness.)
I put winning above predictive accuracy, yes.
As fate would have it, the article What Do We Mean by Rationality? is the page that comes up in my Chrome browser when I type “less”: http://lesswrong.com/lw/31/what_do_we_mean_by_rationality/
It’s a peculiar article, because it gives two concepts as a definition for rationality, Epistemic Rationality and Instrumental Rationality, where clearly the concepts are not identical. And yet all sorts of statements are made thereafter about Rationality without noting the difference between the two concepts.
To answer your question in these terms: for all beliefs where the Instrumentally Rational belief is X and the Epistemically Rational belief is NOT X, I’d rather believe X. I’d rather Win than Correctly Predict, where I have to make the tradeoff.