Instrumental rationality: achieving your values. Not necessarily “your values” in the sense of being selfish values or unshared values: “your values” means anything you care about. The art of choosing actions that steer the future toward outcomes ranked higher in your preferences. On LW we sometimes refer to this as “winning”.
In my opinion, Wikipedia puts things much better here:
http://en.wikipedia.org/wiki/Rationality
Rationality is a central principle in artificial intelligence, where a rational agent is specifically defined as an agent which always chooses the action which maximises its expected performance, given all of the knowledge it currently possesses.
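To make that definition concrete, here is a minimal sketch in Python, with made-up actions, beliefs and payoffs: the agent simply scores each action by its expected payoff under its current beliefs and picks the highest.

```python
def expected_value(action, beliefs, payoffs):
    """Expected payoff of an action under the agent's current beliefs."""
    return sum(p * payoffs[(action, outcome)] for outcome, p in beliefs.items())

def rational_choice(actions, beliefs, payoffs):
    """Pick the action whose expected payoff is highest."""
    return max(actions, key=lambda a: expected_value(a, beliefs, payoffs))

# The agent's current beliefs and payoffs (all numbers invented for illustration).
beliefs = {"rain": 0.3, "dry": 0.7}
payoffs = {
    ("take umbrella", "rain"): 5, ("take umbrella", "dry"): -1,
    ("leave umbrella", "rain"): -10, ("leave umbrella", "dry"): 2,
}

print(rational_choice(["take umbrella", "leave umbrella"], beliefs, payoffs))
# -> take umbrella  (expected payoff 0.8 versus -1.6)
```

Note that nothing in the calculation refers to what actually happens afterwards; the choice is evaluated only against the information the agent currently possesses.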
The advantage Wikipedia has is that it talks about expected performance on the basis of the available information, not about actual performance. That emphasis is correct: rationality is (or should be) defined in terms of whether the operations performed on the available information constitute correct use of the tools of induction and deduction, and it should not depend on whether the information the agent has is accurate or useful.
This has been discussed many times: there is a distinction between trying to win and winning.
Exactly. Rationality is a property of our understanding of our thinking, not the thinking itself.
Being rational doesn’t mean choosing correctly; it’s about having a justified expectation that the choices you’re making are correct.
Well, if the expectation is justified, you are choosing correctly.
Depends on how you look at it.
If the expectation is justified, then the choice is correct from your point of view. But it can easily be wrong in an absolute sense.
If you are allowed to look at statements in a way that inverts their meaning, you may as well close your eyes. “Justified” means being supported by a powerful truth-engine, not being accompanied by a believed rationalization. If, “from my point of view”, it is correct to expect to fly safely when I step out the window, that doesn’t make it correct; the expectation won’t be justified in the normal sense of the word.
Are you not getting the point? Agents can correctly apply inductive and deductive reasoning, but draw the wrong conclusion—because of their priors, or because of misleading sensory data. Rationality is about reasoning correctly. It is possible to reason correctly and yet still do badly—for example if a hostile agent has manipulated your sense data without giving you a clue about what has happened. Maybe you could have done better by behaving “irrationally”. However, if you had no way of knowing that, the behaviour that led to the poor outcome could still be rational.
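The manipulated-sense-data case can be illustrated with the same toy setup (numbers invented again): the agent scores its options correctly given its beliefs, but because those beliefs were fed by a tampered sensor, the option that looks best does badly in reality.

```python
import random

random.seed(0)

# It actually rains 90% of the time, but a tampered sensor has left the
# agent with badly misleading beliefs. (All numbers are invented.)
TRUE_RAIN_PROB = 0.9
misleading_beliefs = {"rain": 0.1, "dry": 0.9}

payoffs = {
    ("take umbrella", "rain"): 5, ("take umbrella", "dry"): -1,
    ("leave umbrella", "rain"): -10, ("leave umbrella", "dry"): 2,
}

def expected_value(action, beliefs):
    return sum(p * payoffs[(action, outcome)] for outcome, p in beliefs.items())

# The agent's reasoning is carried out correctly on the information it has...
action = max(["take umbrella", "leave umbrella"],
             key=lambda a: expected_value(a, misleading_beliefs))

# ...but the realised average payoff is poor, because the inputs were wrong.
total = 0
for _ in range(10_000):
    outcome = "rain" if random.random() < TRUE_RAIN_PROB else "dry"
    total += payoffs[(action, outcome)]

print(action, total / 10_000)   # leave umbrella, roughly -8.8
```

The expected-value step is exactly the same as before; only the inputs are corrupted, which is the sense in which the poor outcome doesn’t make the reasoning irrational.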
Good point, Tim: rational doesn’t mean right.
Garbage in, garbage out.
I absolutely agree with this point. Rationality in this sense is that truth-engine I named in the comment you replied to: it’s built for a range of possible environments, but it can fail in case of an unfortunate happenstance. That is different from having an insane maintainer who is convinced the engine works when in fact it doesn’t, and not just on the actual test runs but across the range of possible environments it’s supposedly built for. When you are 90% sure that something will happen, you expect it NOT to happen 1 time in 10.
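That last sentence is just the arithmetic of calibration, which a quick simulation (with an invented event) makes concrete: a well-calibrated 90% prediction should fail to come true roughly one time in ten.

```python
import random

random.seed(1)

# 10,000 independent events, each of which genuinely happens with probability 0.9,
# i.e. events a calibrated forecaster would be "90% sure" about.
trials = 10_000
did_not_happen = sum(1 for _ in range(trials) if random.random() >= 0.9)

print(did_not_happen / trials)   # roughly 0.1: about 1 failure in 10
```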
“If, ‘from my point of view’, it is correct to expect to fly safely when I step out the window, that doesn’t make it correct”
Yeah, but your “point of view” doesn’t include any stupid belief you have. If you could explicitly justify why you expected to fly when you stepped out that window, and trace that justification all the way back to elementary logic and fundamental observations, it would be totally rational for you to expect that.
It wouldn’t be your fault if the “rules” suddenly changed so that you fell, instead.