I do think there’s an alternate frame where you just say “no, rationality is specifically about being a robust agent. There are other ways to be effective, but rationality is the particular way of being effective where you try to have cognitive patterns with good epistemology and robust decision theory.”
This is in tension with the “rationalists should win” thing. Shrug.
I think it’s important to have at least one concept that is “anyone with goals should ultimately be trying to solve them the best way possible”, and at least one concept that is “you might consider specifically studying cognitive patterns and policies and a cluster of related things, as a strategy to pursue particular goals.”
I don’t think this is quite the same thing as instrumental rationality (although it’s tightly entwined). If your goals are simple and well-understood, and you’re interfacing in a social domain with clear rules, the most instrumentally rational thing might be to not overthink it and follow common wisdom.
But it’s particularly important if you want to coordinate with other agents, over the long term. Especially on ambitious, complicated projects in novel domains.
On my initial read, I read this as saying “this is the right thing for some people, even when it isn’t instrumentally rational” (?!). But
I think it’s important to have at least one concept that is “anyone with goals should ultimately be trying to solve them the best way possible”, and at least one concept that is “you might consider specifically studying cognitive patterns and policies and a cluster of related things, as a strategy to pursue particular goals.”
makes me think this isn’t what you meant. Maybe clarify the OP?
I was meaning to say “becoming a robust agent may be the instrumentally rational thing for some people in some situations. For other people in other situations, it may not be helpful.”
I don’t know that “instrumental rationality” is that well defined, and there might be some people who would claim that “instrumental rationality” and what I (here) am calling “being a robust agent” are the same thing. I disagree with that frame, but it’s at least a cogent frame.
You might define “instrumental rationality” as “doing whatever thing is best for you according to your values”, or you might use it to mean “using an understanding of, say, probability theory and game theory and cognitive science to improve your decision making”. I think it makes more sense to define it the first way, but some people might disagree with that.
If you define it the second way, then for some people – at least, people who aren’t that smart or good at probability/game-theory/cog-science – “the instrumentally rational thing” might not be “the best thing.”
I’m actually somewhat confused about which definition Eliezer intended. He has a few posts (and HPMOR commentary) arguing that “the rational thing” just means “the best thing”. But he also notes that it makes sense to use the word “rationality” specifically when we’re talking about understanding cognitive algorithms.
Not sure whether that helped. (Holding off on updating the post until I’ve figured out what the confusion here is.)
I define it the first way, and don’t see the case for the second way. Analogously, for a while, Bayesian reasoning was our best guess of what the epistemic Way might look like. But then we found out about logical induction, and that seems to tell us a little more about what to do when you’re embedded. So we now see it would have been a mistake to define “epistemic rationality” as “adhering to the dictates of probability theory as best as possible”.
I think that Eliezer’s other usage of “instrumental rationality” points to fields of study that provide the theoretical underpinnings of effective action.
(not sure if this was clear, but I don’t feel strongly about which definition to use – I just wanted to disambiguate between definitions people might have been using)
I think that Eliezer’s other usage of “instrumental rationality” points to fields of study that provide the theoretical underpinnings of effective action.
This sounds right-ish (i.e. this sounds like something he might have meant). When I said “use probability and game theory and stuff”, I didn’t mean “be a slave to whatever tools we happen to use right now”; I meant those sort of as examples of “things you might use if you were trying to base your decisions and actions off of sound theoretical underpinnings.”
So I guess the thing I’m still unclear on (re: people’s common usage of words): do most LWers think it is reasonable to call something “instrumentally rational” if you just sorta went with your gut without ever doing any kind of reflection (assuming your gut turned out to be trustworthy)?
Or are things only instrumentally rational if you had theoretical underpinnings? (Your definition says “no”, which seems fine. But it might leave you with an awkward distinction between “instrumentally rational decisions” and “decisions rooted in instrumental rationality.”)
I’m still unsure if this is dissolving confusion, or if the original post still seems like it needs editing.
Your definition says “no”, which seems fine. But it might leave you with an awkward distinction between “instrumentally rational decisions” and “decisions rooted in instrumental rationality.”
My definition was the first, which is “instrumental rationality = acting so you achieve your values”. So, wouldn’t it say that following your gut was instrumentally rational? At least, if it’s a great idea in expectation given what you knew – I wouldn’t say lottery winners were instrumentally rational.
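(A toy illustration of that last point, with made-up numbers: if a ticket costs $2 and pays $10,000,000 with probability 1 in 10,000,000, then

$$\mathbb{E}[\text{buy ticket}] = \tfrac{1}{10{,}000{,}000} \times \$10{,}000{,}000 - \$2 = \$1 - \$2 = -\$1,$$

so buying was a losing move in expectation, even for the one person who happened to win.)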
I guess the hangup is in pinning down “when things are actually good ideas in expectation”, given that it’s harder to know that without either lots of experience or clear theoretical underpinnings.
I think one of the things I was aiming for with Being a Robust Agent is “you set up the long-term goal of having your policies and actions have knowably good outcomes, which locally might be a setback for how capable you are, but allows you to reliably achieve longer-term goals.”