What if someone has rational reasons for rejecting a belief such as cryonics, but is deliberately using Dark Art rhetoric to talk more convincingly about that belief by associating it with low-status people? You’d class them as irrational when you should class them as unethical.
They would be classed as irrational based on the belief that they do not, in fact, have ‘rational reasons’ for their decision. If that belief is false then it is false—just like any other. The specifics of their Dark Arts rhetoric give some evidence regarding both their ethics and their beliefs, but only a small amount.
Both humility and arrogance are rationalist sins, but arrogance is worse. I can think of at least four different things wrong with this level of self-assurance in this context, and none of them have anything to do with cryonics in particular.
I’m not saying that there’s anything wrong with assigning a high probability to cryonics working. I’m saying that mentally dinging others for not agreeing with you is likely to leave your assessment of both them and of cryonics worse overall than if you didn’t.
No level of self-assurance is specified or implied by me. My reply was to your complaint regarding the qualitative nature of the judgement—that is, that the dishonest people are being judged as irrational rather than unethical when their irrational-seeming arguments are in fact not sincere.
I’m saying that mentally dinging others for not agreeing with you is likely to leave your assessment of both them and of cryonics worse overall than if you didn’t.
I don’t know much about ‘dinging’, but the simple act of disagreeing is already an act of disrespect. Further, when trying to seek out people whose opinions can provide strong evidence to you, it is necessary to apply discretion. Whether that means giving “negative dings” to people whose beliefs are, to the best of your ability to judge, misguided, or “positive dings” to people whose behavior you judge superior, in the end you need to judge the thoughts of others if you hope to improve your own.
“If that belief is false then it is false—just like any other.” I read this statement as an arrogant assumption that a sufficiently rational person could expect to correctly judge the probability of cryonics working with enough certainty that disagreeing with them would be “just like” disagreeing with reality. (Yes, I realize that this is not what you were directly saying with the words “just like”, but as far as I can see it is in fact implied.)
Of course you judge thought processes in order to improve your own. And it’s even rational to judge people by individual examples of their thought processes, though the human tendency is to overdo, not underdo, such generalized judging of people.
But any belief about cryonics (and similar areas) is of necessity based on a relatively long chain of inference from any possible direct evidence. There are much richer ways to assess the quality of that chain than by whether it reaches the same conclusion as your own.
They would be classed as irrational based on the belief that they do not, in fact, have ‘rational reasons’ for their decision. If that belief is false then it is false—just like any other.
I read this statement as an arrogant assumption that a sufficiently rational person could expect to correctly judge the probability of cryonics working with enough certainty that disagreeing with them would be “just like” disagreeing with reality.
I read “that belief” as referring to “the belief that they do not, in fact, have ‘rational reasons’ for their decision”. “Just like any other”, then, probably refers to the fact that many rational beliefs turn out to be incorrect: frequently the optimally epistemically rational degree of credence in a proposition lands further from the mark than a credence chosen by some alternate method, or is fairly high even though the belief is actually false, though on average rational degrees of credence are more accurate. In other words, this is not entirely correct:
What if someone has rational reasons for rejecting a belief such as cryonics, but is deliberately using Dark Art rhetoric to talk more convincingly about that belief by associating it with low-status people? You’d class them as irrational when you should class them as unethical.
In this case, it might be (epistemically) correct to class them as irrational (with some probability, etc.), given the information you have about them.
Similarly, if someone draws a card at random from a standard 52-card deck, your degree of credence that it is the seven of diamonds should be 1/52; it wouldn’t be correct to be more confident than that, even if in actuality it IS the seven of diamonds, as this is information you do not have access to.
(ETA: I’m speaking abstractly here—making no comment on rational beliefs about cryonics.)
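The card-draw point can be made concrete with a short simulation (a sketch of mine, not part of the thread): a credence of 1/52 is calibrated in the sense that, over many uniform draws, the event occurs about 1/52 of the time, regardless of which card any single draw actually produced.

```python
import random

# Build a standard 52-card deck: 13 ranks (2..14, with 11-14 standing in
# for J, Q, K, A) crossed with 4 suits.
deck = [(rank, suit) for rank in range(2, 15)
        for suit in ("clubs", "diamonds", "hearts", "spades")]
assert len(deck) == 52

# Repeatedly draw a card at random and count how often it is the
# seven of diamonds.
rng = random.Random(0)
trials = 100_000
hits = sum(rng.choice(deck) == (7, "diamonds") for _ in range(trials))

# The long-run frequency converges toward 1/52 (about 0.0192), which is
# why 1/52 is the correct credence before the card is revealed.
frequency = hits / trials
```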