If you ask me, the term “instrumental rationality” has been subject to inflation. It’s not supposed to mean better achieving your goals, it’s supposed to mean better achieving your goals by improving your decision algorithm itself, as opposed to by improving the knowledge, intelligence, skills, possessions, and other inputs that your decision algorithm works from. Where to draw the line is a matter of judgment but not therefore meaningless.
Agreed. Systematic instrumental rationality is what we’re interested in: better general methods. Akrasia and the problem of internal conflicts fit this template; making better coffee does not, however useful you may find it.
Could anyone here recommend areas where one could attempt to discuss some of society’s more pressing issues using the very general methods described here? Politics and making better coffee?
While I agree such posts would not fit here, such discussions would serve as practice. In a community similar to this one, hard evidence and constructive criticism would ideally be the norm.
Assuming promoted articles are subject to your veto, I don’t see much harm in original posts of exceptional quality, even if they are either overly meta-LW or overly domain specific. Of course, one must draw the line at pictures of kittens.
I’m unclear on why you’d think there’d be a bright-line distinction between “reason” and “rationality”. In most cases they seem usable interchangeably in ordinary language.
The combination of inductive reasoning and deductive reasoning seems like a natural category—which I think needs a name. You could call this “logical reasoning”—but “reasoning” seems better to me. The term covers both logical and illogical reasoning—though of course the latter sort is not of very much use. What it doesn’t cover is perception, goals, or experimental generate-and-test.
If it is out there, better terminology would be welcomed.
Upvoted for not being about gender.
This is indeed a deep rabbit hole.
Are we really bad enough at voting that we can’t be trusted to downvote pictures of kittens?
I’m not certain that anyone can very reliably be trusted to downvote e.g. the failcat sequence.
Come on, if that doesn’t demonstrate a relevant failure of rationality I don’t know what does.
ETA: (insert standard convention for tagging this as an attempt at humor)
I strongly agree, and I’d like to add that I definitely see a place for this sort of instrumental rationality here.
Isn’t “reason” the best name for what you are talking about there?
http://en.wikipedia.org/wiki/Reason can be thought of as including induction and deduction—but not empirical generate-and-test procedures.
Is it a good idea to redefine “instrumental rationality” to mean the same as this existing term?
Well, Bayesian probability