“Predictable” and “changeable” have limits, but people generally don’t know where those limits are. What looks like bad luck to one person might look like the probable consequences of taking stupid chances to another.
Or what looks like a good strategy for making an improvement to one person might look like knocking one’s head against a wall to another.
The point you and Eliezer (and possibly Vaniver) seem to be making is that “perfectly rational agents are allowed to get unlucky” isn’t a useful meme, either because we tend to misjudge which things are out of our control or because it’s just not useful to pay any attention to those things.
Is that a fair summary? And, if so, can you think of a better way to express the point I was making earlier about conceptually distinguishing rational conduct from conduct that happens to be optimal?
ETA: Would “rationality doesn’t require omnipotence” suit you better?