What about cases where any rational course of action still leaves you on the losing side?
Although this may seem impossible according to your definition of rationality, I believe such a scenario can be constructed because of the fundamental limitations of a human brain’s ability to simulate.
In previous posts you’ve said that, at worst, the rationalist can simply simulate the ‘irrational’ behaviour that is currently the winning strategy. I would contend that humans can’t simulate effectively enough for this to be an option. After all, we know that several biases stem from our inability to effectively simulate our own future emotions, so effectively simulating an entire other being’s response to a complex situation would seem to be a task beyond the current human brain.
As a concrete example I might suggest the ability to lie. It’s fairly well established that humans are not hugely effective liars, and that therefore the most effective way to lie is to truly believe the lie. Does this not strongly suggest that the limitations of simulation mean a rational course of action can still be beaten by an irrational one?
I’m not sure that, even if this is true, it should affect a universal definition of rationality—but it would place bounds on the effectiveness of rationality in beings of limited simulation capacity.