I think rationality itself is also a metric that fits this pattern (similarly to enlightenment). Given that becoming more rational isn’t free, and might actually have a substantial cost, I’m pretty sure that for many people it’s not worth it to invest in becoming more rational. I feel there’s a full post to be written here, but I don’t yet have the clarity to write it.
I actually just started drafting a post in this vein. I’m framing the question as “what are the relative advantages and disadvantages of explicit rationality?”. It’s a natural follow-up to “problems we don’t understand”: absent practice in rationality and being agenty (whether we call it that or not), we’ll most likely end up as cultural-adaptation-executors. That works well mainly for problems where cultural/economic selection pressures have already produced good strategies. Explicit rationality is potentially useful mainly when that’s not the case—either because some change has messed up evolved strategies, or because selection pressures are misaligned with what we want, or because the cultural/economic search mechanisms were insufficient to find good strategies in the first place.
Regarding “problems we don’t understand”, you pointed out an important meta-systematic skill: figuring out when different systems apply and don’t apply (by applying new systems learned to a list of 20 or so big problems).
The new post you’re alluding to sounds interesting, but rationality is a loaded term. Do you have specific skills of rationality in mind for that post?
No, which is part of the point. I do intend to start from the sequences-esque notion of the term (i.e. “rationality is systematized winning”), but I don’t necessarily intend to point to the kinds of things LW-style “rationality” currently focuses on. Indeed, there are some things LW-style “rationality” currently focuses on which I do not think are particularly useful for systematized winning, or are at least overemphasized.
I don’t know what point you’re referring to here. Do you mean that listing specific skills of rationality is bad for systematized winning?
I also want to wrangle more specifics from you, but I can just wait for your post. :)
Oh, I mean that part of the point of the post is to talk about what relative advantages/disadvantages rationality should have, in principle, if we’re doing it right—as opposed to whatever specific skills or strategies today’s rationalist community happens to have stumbled on. It’s about the relative advantages of the rationality practices which we hopefully converge to in the long run, not necessarily the rationality practices we have today.
Oh! That makes sense as a post on its own.
Listing pros and cons of current rationalist techniques could then be compared to your ideal version of rationality to see what’s lacking (or points out holes in the “ideal version”). Also, “current rationality techniques” is ill-defined in my head and the closest I can imagine is the CFAR manual, though that is not the list I would’ve made.
Yup, exactly.
Nice! Looking forward to reading your post. I wrote a few notes myself under the title “Should You Become Rational”*, but it didn’t turn into enough for a post. One of the things I wanted to consider is whether it’s someone’s duty to become more rational, which I think is an interesting question (and it’s a topic that was discussed on LW; see Your Rationality is My Business). My current conclusion is that your obligation to become more rational is relative to how much influence you have, or wish to have, on the world and on other people. Of course, even if true, this point might be slightly moot, since only someone who is already interested in rationality might agree with it; others are unlikely to care.
* “Rational” pretty much for lack of a better word that still kept it short; I didn’t want to use “rationalist”, as that’s an identification with a specific group, which isn’t the point.
“Rationality” was a vague metric for me when I first started reading the sequences. Breaking it down into clear skills (taking ideas seriously, noticing confusion, “truth” as predictive accuracy, etc) with explicit benefits and common pitfalls would be useful.
Once you nail down what metrics you’re talking about when you say “rationality”, I believe the costs and benefits of investing in becoming more rational will be clearer.
Feel free to brainstorm as replies to this comment, I would enjoy a full post on the subject.