Instrumental Rationality is a Chimera
Eliezer observes, “Among all self-identified ‘rationalist’ communities that I know of, and Less Wrong in particular, there is an obvious gender imbalance—a male/female ratio tilted strongly toward males,” and provides us with a selection of hypotheses that attempt to explain this notable fact, ranging over the usual cultural and biological explanations for male/female imbalances in any community. One important point was missing, however: a point raised by Yvain last week under the title Extreme Rationality: It’s Not That Great. That point is that we have not done anything yet. Eliezer writes under the assumption that women ought to want to study our writings, but since we have so far failed to produce a single practical application of our rationalist techniques, I really cannot blame women for staying away. They may be being more rational than we are.
Long have we pondered Eliezer’s enigmatic homily, “Rationalists should win,” and like the Aristotelians of old we agreed that it must be so, since a proclivity to win is inherent in the definition of the word “rationalist”.
Well, have you won anything lately? Are the horizons of your power expanding, you rationalist Übermenschen? Perhaps you will say, “We have only just gotten started! We are pregnant with potential, if not abounding with achievements.”
I do not mean to be impatient, but it has been a few weeks now and we appear to be spinning our wheels. As interesting as many of the posts here have been, I cannot recall any of them being instrumentally useful to me, nor has anyone else here mentioned posts that have been instrumentally useful to them. In fact it almost seems as if most of the posts contributed by the Less Wrong community have been about the Less Wrong community. These self-referential meta-posts accumulate, and as they become increasingly impenetrable they discourage potential contributors of either sex.
Since the confusion caused by this notion of instrumental rationality shows no signs of abating, I will attempt to cut the knot. There is no such thing as instrumental rationality. What is the rational way to butter toast? Brew coffee? Drive a car? Raise a child? Conduct a particle physics experiment? You will notice that the unifying feature among these examples is that there is no unifying feature among these examples. Rationality – real world, day to day, nine to five rationality – is entirely context dependent. The attempt to develop a grand unified theory of instrumental rationality is an attempt to abstract away from the details of individual circumstances, in order to come up with a Best Way To Do Everything Forever. This is untenable. Rationality can be used to choose the best course of action for achieving a particular goal, but this is simply an example of knowing the truth – epistemic rationality.
I think that we have been on the wrong track, up until now. I believe we can do better, but first we must abandon the silly martial arts metaphors. You do not need academic-grade rationality every second of the day and you do not need to pretend that you are the only rational person in the world. Co-operate. In order to live rationally and live well, we must have easy access to organised expert domain knowledge in useful areas such as self-motivation, health and fitness, development of social skills, use of technology and of course, the abstract rules of epistemic rationality. I am sure there is much more that could be added to this list. To achieve this I suggest that, like an economy, we subdivide and specialise. Rather than racking their brains in an attempt to come up with something novel to say on the topic of abstract rationalism, we should encourage contributors to tell us about something they specialise in, to give us advice backed by evidence and reasoned argument about something they know a lot about, and to direct us to useful references wherein we may learn more. I imagine people contributing a guide to getting accurate medical information, tips on child psychology and raising children, or an essay on how to exercise to increase longevity.
Clearly, we have a group of interested, motivated, highly intelligent people here at Less Wrong, each of whom has their own particular talent, so why not make the most of them?
The rational way to do all of these things is to discover and do what is physically required to achieve them: to be constrained by how the world is, as opposed, for example, to merely wishing for buttered toast, saying the name of God over the espresso machine, driving straight out into heavy traffic the first time you sit behind a steering wheel, beating a baby to make it stop crying, or neglecting the study of all the mathematics, physics, engineering, and management necessary to build and run the LHC.
I am not seeing the problem here. There is such an art, and OB and LW are about it.
I agree. We need war stories (if that’s not an off-puttingly masculine way of describing it) in addition to generalities. “The Art must have a purpose other than itself.”
Bodies creating software or hardware specifications sometimes make the following rule: no specification without implementation. Anyone proposing something for the spec must also demonstrate an implementation. The principle can be widely applied.
I don’t have a war story to add to this post though. Today, like most working days, I will apply myself to various mathematical and programming tasks at work, but I doubt anyone wants to hear about finite element modelling or GUI design on LW. In the evening I will do various other things not of consequence here. Where is rationality being applied? Well, where would it not be being applied?
Masculinity isn’t off-putting.
That’s a subjective value judgement from your point of view.
If you intend it to be more than that, you would have to explain why others shouldn’t see it as off-putting.
Otherwise, I don’t see how it contributes to the discussion beyond “there’s at least one person out there who thinks masculinity isn’t off-putting,” which we already know; there are billions of examples.
I love how this is the hill we’re dying on.
No bodies yet, in fact I would consider this preventive maintenance...
Using a very informal definition of rationality as the opposite of irrationality, is there an irrational way to butter toast? If so, does it not imply a rational way to butter toast?
There is a rational way to butter toast but it has nothing to do with the rational way to conduct particle physics experiments.
“Nothing to do” is a bit strong. The rational way to butter toast is related to the rational way to conduct particle physics experiments in trivial ways (such as that they both exist, and both use the term “rational way”). My gut reaction is that there is a deeper link between the two, but I have not thought about it at length.
I suppose the first path I would explore is that the same principles I use to discover irrationality can easily be applied in both circumstances. To use two things from your list I am familiar with, brewing coffee and driving a car certainly have parallels that can be abstracted so as to apply things to both activities.
Namely, to be rational in either I have to define what success and failure mean. Then a system of measuring success and failure needs to be determined. And yada yada. I can keep going, but I think the point is simply this: “rationality” applies to both, and learning how to be rational with a cup of coffee should help me become a rational driver.
This may be a bit too abstract to qualify as a counter-point to the phrase “nothing to do with”.
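The shared skeleton the comment gestures at can be made concrete. Here is a minimal sketch of "define success, measure it, compare procedures"; every name and number below is hypothetical, chosen only to show that the same loop fits coffee-brewing and driving alike.

```python
# Minimal "define success, measure, compare" skeleton.
# All procedures, trial inputs, and criteria here are toy placeholders.

def evaluate(procedure, trials, success):
    """Fraction of trials in which `procedure` meets the `success` test."""
    results = [procedure(t) for t in trials]
    return sum(success(r) for r in results) / len(results)

# Only the procedure, trials, and success criterion change per domain.
brew_ok = evaluate(lambda grams: grams * 0.9,  # toy "brew" procedure
                   [10, 12, 14],               # toy trial inputs
                   lambda cup: cup > 9)        # toy success criterion
```

The point is not the arithmetic but the shape: once you can phrase "success" and "trial" for a domain, the same evaluation loop applies.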
But what do they have in common? The inherently mysterious human mind?
Humans want buttered toast and knowledge of particle physics. Other than that I cannot think of anything else they have in common. If we are discussing beliefs about toast and particles then we are in the realm of epistemic rationality.
Agreed, definitely. I’ve found that Eliezer’s posts, in general, have given me a sense of thinking more clearly, but beyond enjoying that I haven’t been able to make much use of them. I’m reminded of Robin’s posts on marginal medical spending, which leave me thinking “OK, I’m convinced… what am I actually supposed to do about this?”
Heh, my wife and I recently did some experiments with our coffee consumption, which resulted in our switching from our long-preferred brand (“Davidoff”) to a cocktail made out of two brands.
The first brand (“Cubana”) is relatively expensive, poor-flavored and poor-tasting but well-caffeinated, and the second one (“Jardin”) is relatively cheap, good-tasting and good-flavored but less caffeinated. We mix them 50/50 when we want more caffeine (rarely) or 30/70 when we want a good balance of flavor, caffeine and taste (most of the time).
The reason we started exploring brands other than Davidoff was that its counterfeit percentage (as subjectively perceived by us) was above 33%, which is too high for a relatively expensive brand. We started exploring around and quickly learned that cheap brands almost always taste like crap, so we reallocated our coffee exploration budget to favor mid-range and expensive brands. The expensive brands usually failed to provide a good price/performance ratio, so we settled on mid-range, where we finally found “Jardin” and “Cubana”.
To sum up, we minimized the cost, maximized taste, flavor and caffeine content, and made the scheme adaptable for our current needs (caffeine vs. taste).
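The blending trade-off above is just a weighted average over the mix ratio. A toy sketch, with all scores invented on an arbitrary 0–10 scale (only the brand names come from the comment):

```python
# Toy model of the two-brand coffee blend. All scores are made-up numbers.
brands = {
    "Cubana": {"caffeine": 8, "taste": 3, "cost_per_cup": 0.40},
    "Jardin": {"caffeine": 4, "taste": 8, "cost_per_cup": 0.25},
}

def blend(ratio_cubana):
    """Linearly interpolate the properties of a two-brand mix."""
    c, j = brands["Cubana"], brands["Jardin"]
    return {k: ratio_cubana * c[k] + (1 - ratio_cubana) * j[k] for k in c}

morning_kick = blend(0.5)  # the 50/50 "more caffeine" mix
everyday = blend(0.3)      # the 30/70 balanced mix
```

With these placeholder scores, the 50/50 mix trades taste for caffeine relative to the 30/70 mix, which is exactly the adaptability the comment describes.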
I didn’t explicitly record P(brandX|goodtaste) and P(cheap|goodtaste), but I have a feeling that I’ll try that sometime on some similarly silly subject—just for the sheer fun of it :)
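For what it’s worth, the update joked about above would look something like this; the priors and likelihoods are invented for illustration, not recorded measurements:

```python
# Hypothetical Bayes update: "which brand is in this cup, given it tastes
# good?" All probabilities below are invented placeholders.

priors = {"Davidoff": 0.5, "Jardin": 0.3, "Cubana": 0.2}        # P(brand)
p_good_given = {"Davidoff": 0.6, "Jardin": 0.9, "Cubana": 0.3}  # P(good taste | brand)

# P(good taste), by the law of total probability
p_good = sum(priors[b] * p_good_given[b] for b in priors)

# P(brand | good taste), by Bayes' theorem
posterior = {b: priors[b] * p_good_given[b] / p_good for b in priors}
```

With these made-up numbers, a good-tasting cup shifts probability toward Jardin, the brand most likely to produce one.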
What’s the difference between flavor and taste?
Flavor can implicitly refer to intensity, whereas taste typically won’t. In a more pedantic sense, flavor also often includes scents as the complete experience (and possibly sensation of hot/cold/pain for certain chemicals, such as menthol or capsaicin), whereas taste properly refers only to the five essential tastes that the tongue can detect.
Warning: I am an enthusiastic amateur home cook.
Philip Greenspun hypothesizes something similar for the under-representation of women in hard sciences. Scott Aaronson disagrees.
(Of course, just because LW isn’t helping one with instrumental rationality doesn’t mean it’s irrational for one to read it; maybe one is reading it because it’s fun, and the fun is worth the time spent on it.)
I agree for the most part with Tom. Here’s a quote from an article that I drafted last night but couldn’t post due to my karma:
“I read comments fairly regularly that certainly imply that people are less successful or less fulfilled than they might be (I don’t want to directly link to any but I’m sure you can find them on any thread where people start to talk about their personal situation). Where are the posts that give people rational ways to improve their lives? It’s not that this is particularly difficult—there’s a huge psychological literature on various topics (for instance happiness, attraction and influence) that I’m sure people here have the expertise to disseminate. And it would have obvious applications in making people more successful and fulfilled in their day to day lives.
It seems to me that the Less Wrong community concentrates on x-rationality, which is a larger and more intellectually stimulating challenge (and, cynically, better for signalling intellectual prowess) at the expense of simple instrumental rationality. It’s as if we think that because we’re thinking about black belt x-rationality, we’re above applying blue belt instrumental rationality.
In my life I’m constantly learning new and more accurate models with which to understand the world, models that don’t come anywhere near the complexity of deciding whether to one-box or two-box. They are useful more often, though.
This isn’t to denigrate x-rationality. Obviously it’s important, but there currently seems to be no balance on LW between it and instrumental rationality. As a side benefit, I’ll bet good money that the best way to get people interested in rationality is to simply show them how successful you are when applying it—something that would be more possible with instrumental rationality than x-rationality.”
I disagree with Tom over the terminology, though. I quite like the terms x-rationality and instrumental rationality because they allow me to easily talk about two broad types of rational thought, even though I would be hard pressed to draw a specific line between them.
Why are there more men standing around doing nothing as opposed to women standing around doing nothing? I am not sure how your post addresses the question of the apparent gender imbalance.
I said something in the other thread along the lines of, “women are more likely to have friends and men are more likely to be nerdy loners” but yes, I was really just using the gender imbalance thing as a jumping off point.
Ah, okay. Thanks for the clarification.
If you’ve been reading this site, you’ve seen repeated objections, including my own, to this definition.
There is no such thing as non-instrumental rationality; for it would not be rational. It would be an arbitrarily-chosen value.
A belief that galaxies exist is a rational belief, but has no instrumental value. It is simply true.
It’s instrumental to the goal of understanding the universe. Moreover, if I didn’t believe that galaxies exist I doubt I would be getting paid to study them.
No, a belief that galaxies exist is rational given certain evidence.
Your claim about its instrumental value is simply wrong. Any knowledge is useful if it has consequences for something we wish to do, and the only way to know whether a given piece of knowledge has such consequences is to know it well.
I agree that we could use some more domain-specific essays. That said, I think you’re undervaluing the pure philosophy and cognitive psychology side of this site’s writings.
Less Wrong is only a blog. The only thing it can ever hope to achieve is to produce some good essays. The people who read those essays may go on to achieve things, but those only reflect on them personally, not on Less Wrong.
No, that is expressly disavowed by the site’s title. All progress is incremental; it isn’t possible to be completely right, only to be less wrong, or less often wrong.
Don’t you have a sense that more is possible?
Well alright, I was exaggerating for rhetorical purposes. But still, the point stands that “instrumental rationality” does not correspond to anything that anyone can actually do. It is a meaningless label.
Um, yes there is. In all cases, the rational way is the way that produces predictably suitable results relative to your desired level of utility and/or investment for that result. And in general, the procedures you use will be defined by testing according to some predefined criteria, after being generated through creative and/or problem-solving processes.
And that, more or less, is the skillset (or at least the rough scope of such a skillset) of instrumental rationality.
Agreed.
But if you don’t know how to hypothesize, measure, and test, you have no way to sort that domain knowledge for usefulness to you.
What are these, exactly?
I apologise, but is there some simpler way for you to express this? I do not understand what it means. If you stated it in terms of real-life thoughts, actions and problems I might grasp it better.
Again, is there some simpler way to express this? Who generated the procedures? Was it me or someone else? How are the procedures defined by the testing criteria? Surely a procedure is a sequence of steps toward some goal, and does not have a definition.
I remain hopelessly unenlightened.
Logical fallacies, Bayes’ theorem, a working knowledge of the scientific method, and knowledge of heuristics and biases.
Concretely: I wish my toast to be buttered (let’s leave aside for now why I have this desire and if it is rational). Innate, largely unconscious instrumental human rationality (the process by which we inductively learn how to behave in the world) has already equipped me with a way to achieve my goal: apply butter to knife, apply buttered knife to toast. If this toast-buttering strategy achieves the desired outcome in a relatively short amount of time and to a satisfactory quality level then I am being perfectly rational in not pursuing the issue further. I’m achieving predictably (repeatable on many trials) suitable (meeting my goals satisfactorily) results relative to my desired level of utility (buttered toast in my belly is a small but detectable increase in utility) and/or investment for that result (I don’t want to invest too much time in trying to optimize my toast buttering due to my estimate of the cost/benefit of further time investment in so doing).
It is possible that the simple procedure above does not meet my goals. Perhaps I am frustrated by the fact that the cold butter from the fridge tends to destroy the surface of the toast and also fails to spread adequately, thus lowering the utility I gain from buttered toast consumption. Perhaps I work in a cafe serving English breakfast and I find myself spending a large amount of time buttering toast and desire a more efficient bulk toast-buttering technique. In either case the basic principles of rationality could be brought to bear to improve my results, if my estimate was that the time investment of attempting improvements would be justified by my expected increase in utility.
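The cost/benefit estimate in that last sentence can be made explicit. A toy sketch, where every number is an invented placeholder rather than a measurement:

```python
# Toy cost/benefit check for "is optimizing my toast-buttering worth it?"
# All quantities below are invented placeholders, not measurements.

def worth_optimizing(gain_per_use, uses, one_off_cost):
    """Expected net benefit of improving a routine procedure."""
    return gain_per_use * uses - one_off_cost

# A home cook butters a few hundred slices a year; the cafe worker in the
# example above butters tens of thousands.
home_cook = worth_optimizing(0.02, 300, 5.0)      # roughly break-even
cafe_worker = worth_optimizing(0.02, 30000, 5.0)  # clearly worthwhile
```

The same tiny per-use gain that is not worth chasing at home easily justifies the investment for the cafe worker, which is why the two scenarios in the comment come out differently.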
Thank you, Matt. This takes us to the heart of the matter.
Isn’t this a completely ludicrous example of “rationality”? What does “rationality” signify in this case? Unconscious co-ordination and control of the senses and muscles? That is not what I, or any sane person, understand by the word “rationality”! Is “rationality” just a universal signifier for “doing stuff right”? The word has been stretched beyond all meaning! You may rationally believe that a particular way of buttering the toast is the correct way, but this is an example of epistemic rationality. There is no need to invoke the non-concept of instrumental rationality.
I’ll be honest with you, I have never detected utility in my belly. I know this is tangential, but the concept of utility just does not seem at all useful. How do we calibrate this unit of measurement? Or is it simply an abstraction? I can see how that might be useful in an abstract discussion, but in this case why not refer directly to the actual feelings felt? Saying that you “increased utility” is pointless jargon, and its use contributes to the kind of confused word-salad which occasionally appears in these sorts of discussions.
Here’s the crux of it. Your improvements will make use of the standard tools of epistemic rationality, the scientific method and all the rest of it. There is no separate world of instrumental rationality. At best “instrumental rationality” may be defined as epistemic rationality applied to the problem of choosing among methods for achieving a goal, a rather weak and pointless category. The actual methods themselves are emphatically not a form of rationality.
No, instrumental rationality is the meta-process you apply to choosing or refining the primary process (i.e., the actual toast-buttering).
If you look carefully at the original statement that I made, you’ll find that there are a large number of places where people fail at instrumental rationality:
Failing to establish success criteria in advance
Failing to determine desired/feasible levels of investment
Failing to test
Failing to generate alternatives
Failure to apply creativity
Failure to apply problem-solving
And these are just the failures you can generate by a literal reading of my statement, without addressing things like failures within each of these areas, like failure to establish a baseline for testing, etc.
These are all ways in which I’ve seen large, expensive, real-world projects fail… and a lot of people in the business world will nonetheless look at you funny when you ask questions like, “so, how will this make the company money?”
(And a small minority, thank heaven, will think you’re a genius (or recognize a fellow-traveler) and start bringing you in to ask these kinds of questions sooner in the process.)
The rationality I was referring to wasn’t the rational control of the muscles. It was the rational belief that applying butter to knife and buttered knife to toast would result in buttered toast. I’m thinking of my favourite rebuke to the claim that “there are no atheists in a foxhole”: “Tell a devout Christian that his wife is cheating on him, or that frozen yogurt can make a man invisible, and he is likely to require as much evidence as any one else, and to be persuaded only to the extent that you give it.”
Our ‘common sense’ knowledge of the world is grounded in a basic rationalism. How could it be otherwise, given that rationality is ‘that process of thinking that delivers correct answers’? It has been empirically demonstrated that our rationality is flawed and limited, however. I understand the term ‘instrumental rationality’ to mean simply the conscious application of the principles of rationality to achieving better results in the mundane business of everyday existence. Understanding where innate unthinking rationality fails is perhaps the most important step to improving our outcomes.
A practical distinction that is suggested to me by the term ‘instrumental rationality’ is the emphasis on only being as rational as is justified by the circumstances. It would be irrational to perform a full cost benefit analysis on toast buttering given that my current technique achieves acceptable results with minimal effort.
Then I don’t think you’ve really understood the concept of utility. Rationality has had the most practical benefits for me when applied to achieving outcomes whose utility I can ‘feel in my belly’. I can’t perfectly calibrate a unit of measurement but by attempting to weigh up the relative utilities of different outcomes and let estimates of expected values of different choices influence my thinking I find I can usefully improve my decisions. That is really what I mean by ‘instrumental rationality’ - something less formal than full blown x-rationality but more conscious than the rationality which comes naturally and without thinking.
Informed by but not identical with. I find the distinction useful. There are many decisions I have to make that do not justify the investment of resources required to perform a rigorous analysis but that seem to me to benefit from an effort to informally apply principles of rationality that (I feel but cannot rigorously prove) make me less wrong than I would be if I did not make a conscious effort to apply them.
One of the key (informal, non-rigorous) insights for me from Bayes’ Theorem is that it is perfectly rational to make best guesses derived from many uncertain inputs. There is no need to be certain of any of your premises to make decisions that are still the best decision you can make given the context.
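That insight can be sketched concretely: several pieces of evidence, none of them conclusive on its own, multiply together into a confident best guess. The likelihood ratios below are invented for illustration, and the independence assumption is stated in the code:

```python
# Sketch of combining several weak, independent pieces of evidence into a
# single best guess. The likelihood ratios are invented placeholders.

def update_odds(prior_odds, likelihood_ratios):
    """Multiply prior odds by each piece of evidence's likelihood ratio
    (assumes the pieces of evidence are independent)."""
    odds = prior_odds
    for lr in likelihood_ratios:
        odds *= lr
    return odds

prior_odds = 0.25           # P(H) = 0.2, expressed as odds
evidence = [2.0, 1.5, 3.0]  # each ratio only weakly favours H on its own

posterior_odds = update_odds(prior_odds, evidence)
posterior_prob = posterior_odds / (1 + posterior_odds)
```

No single input here is anywhere near certain, yet the combined posterior ends up well above the prior: exactly the point about making best guesses from many uncertain inputs.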
If one attempts to butter their toast by first rubbing the clean knife on the toast and then buttering the knife afterwards, I would not hesitate to describe this as an instrumentally irrational procedure for buttering toast.
It is likely that many people, faced with real-life tasks more complicated than buttering toast, do in fact apply methods abstractly resembling this alternate approach to toast-buttering.
He said that particular way of buttering toast was a rational way, not the rational way. There are other ways of buttering toast which may be rational, such as dipping it in liquefied butter or hiring a chef to do it for you.
Or perhaps you enjoy inventing complicated gadgets, in which case you might build an elaborate Rube Goldberg device to do it for you.
That’s the beauty of instrumental rationalism… we aren’t constrained by the silly, petty notion that there’s only ONE “correct” way to do something. ;-)
mattnewport has already answered the rest, so I’ll just fill in this bit. What I said was:
Meaning, “the procedures you use” (e.g. for toast-buttering) “will be defined” (i.e., determined, circumscribed, narrowed, filtered, specified) “by testing” (to determine suitability) “according to some predefined criteria” (i.e., your criteria for what a successful result would consist of) “after being generated” (i.e., first you generate procedures, then you test them), “through creative and/or problem-solving processes” (i.e., either you generate improvements on, alternatives to, or solutions for problems in an existing procedure, borrow a procedure from someone else, or attempt to invent or derive one from scratch.)