This is not the just world fallacy; in fact, for specific values of “better” these are empirical statements.
No, the “just world fallacy” is a belief that the world always reaches morally “fair” outcomes. So “better” here has to mean that they deserve such outcomes in a moral sense. My guess is that many people here would reject these claims and find them quite objectionable, but it’s hard to deny that some followers of the Dark Enlightenment (albeit perhaps a minority) seem to be motivated by them. The just world fallacy (in addition to other biases, such as ingroup tribalism) provides one plausible explanation of this.
No, the “just world fallacy” is a belief that the world always reaches morally “fair” outcomes. So “better” here has to mean that they deserve such outcomes in a moral sense.
Ok, so which moral theory are we using to make that determination?
Someone who behaves more rationally is more likely to achieve his goals. Do you consider this a “fair” or “unfair” outcome?