I feel like Less Wrong isn’t a good place to turn to marketing tactics if your rational argument fails
So what do you turn to on Less Wrong, when your rational argument fails for no apparent rational reason? The dilemma runs like this:
I make my awesome rational argument for the expected benefit of doing X
Rationalists on LW decline to do X for reasons that look not-very-rational to me
I try roundabout arguments to sneak past whatever bias is keeping people from getting with the program
At least, that’s how it looks if you accept the pro-cryonics arguments. Either way, the lack of consensus (and action) on cryonics seems indicative to me of a substantial failure of rationality somewhere… but I’m not sure where.
ETA: I am not advocating the use of “marketing tactics” here; I’m trying to raise the question of what to do when you think your rational argument is triggering something like an absolute denial macro in other rationalists for unknown reasons. “Reconsider whether your argument is correct” is a valid response, as is “give up for now”.
Please don’t overuse the concept “absolute denial macro”. This refers to the inability to notice that your left arm is paralyzed. You’re talking about strictly ordinary denial.

I was trying to use it in roughly the same sense that taw was in the post I linked to. Upon reflection, yes, it’s silly in this context.
I’m uncomfortable with the reification of an “absolute denial macro”, period. I think the observed behavior is more likely to be the product of a damage-induced cognitive deficit rather than some macro-like subroutine that gets activated inappropriately.
I argue that doing X is rational
Other people irrationally disagree
I abandon the rational argument and use other tactics to make them agree with me
I know down to my gut this is a bad idea.
If rationalists on LW are being irrational, let them grow to become better rationalists and try again. To act otherwise is to fall prey to the Dark Arts. Accepting a rational belief for irrational reasons means you can reject that belief for equally irrational reasons.

There is a large, but subtle, difference between these scenarios:
Abandon rational argument and use manipulative tactics
Rework and rephrase the rational argument so as to avoid triggering some reflexive bias
I’m certainly not advocating the former and didn’t mean to sound like I was.
Accepting a rational belief for irrational reasons means you can reject that belief for equally irrational reasons.
Removing an irrational bias against doing something is not the same thing as adding an irrational reason for doing it. The per diem strategy is aimed at removing one of the perceived penalties of choosing cryonics, not adding another reason to do it.
Existing beliefs get the benefits of status-quo bias and confirmation bias. It takes work to remove them. So your statement above is nonsense.
It’s unlikely that you got your current so-called rational beliefs through rational methods. Like, win-the-lottery unlikely. Do you think that any of Eliezer’s posts are convincing purely for rational reasons? If so, you are engaging in self-delusion.
I know down to my gut this is a bad idea.
Which, in this context, only increases the probability that you’re wrong, and for an utterly irrational reason.
Now, if you knew down to your gut that something was a good idea, and it made you happy to think so, I would be far less suspicious. For whatever reason, negative emotions seem to bias our reasoning processes much more than positive ones do.
The per diem strategy is aimed at removing one of the perceived penalties of choosing cryonics, not adding another reason to do it.
It can easily be read in the form of how cheap it is, rather than how expensive it isn’t.
Existing beliefs get the benefits of status-quo bias and confirmation bias. It takes work to remove them. So your statement above is nonsense.
I didn’t mean rejecting that belief after accepting it. I meant that an irrational argument for a belief can be negated by an irrational argument against it. If you say the sky is blue because it is made of blue jeans, I can counter your argument by saying the sky is blue because it is made of paint. Obviously that doesn’t get us anywhere. Make a rational argument or don’t make an argument at all. If it helps, modify “Accepting” to “To be able to accept”.
It’s unlikely that you got your current so-called rational beliefs through rational methods. Like, win-the-lottery unlikely. Do you think that any of Eliezer’s posts are convincing purely for rational reasons? If so, you are engaging in self-delusion.
I’m not sure what you mean. I am certain I hold many beliefs that I was persuaded to accept by very convincing and skilled people, Eliezer among them. I think Eliezer writes very persuasively. Persuasive methods do not, however, imply the intended or even accidental employment of irrational methods when making a case. And if I thought I gained most or all of my “current so-called rational beliefs” through irrational methods, I would be very disturbed. The beliefs I am aware of currently adhering to were not put there by some motivational speaker but have been cultivated through study, reflection and revision.
Which, in this context, only increases the probability that you’re wrong, and for an utterly irrational reason.
I thought my phrasing made it clear enough that it wasn’t just a gut feeling, but that my opposition to it was strong to the point where I actually had a gut feeling about it, not just an intellectual position. Instead of merely thinking “I know that’s wrong” I was thinking “How could that possibly be right.”
It can easily be read in the form of how cheap it is, rather than how expensive it isn’t.
This reminds me of an amusing segment of one of Frank Kern’s marketing classes, in which he explains that it doesn’t matter how cheap the product is if no one wants it. He then proceeds to illustrate this idea by joking about having a “new” device here that will cut off a chunk of a certain masculine body part, for “only” $1/day.
IOW, believing that cryonics is cheap is only relevant if you want it in the first place.
If you say the sky is blue because it is made of blue jeans, I can counter your argument by saying the sky is blue because it is made of paint.
Ridiculous! Everyone knows the sky is made of blue cheese.
(Seriously, your argument makes no more sense than that. There is no direct causal relationship between the argument that convinces you of a thing, and the argument that talks you back out of it or into something else.)
Persuasive methods do not, however, imply the intended or even accidental employment of irrational methods when making a case.
Perhaps “arational” would be a better word. The point is, when you are enjoying listening to a story, you are not engaging in reasoning, as it’s incompatible with experiencing the story.
And if I thought I gained most or all of my “current so-called rational beliefs” through irrational methods, I would be very disturbed.
Which is what keeps you motivated to not pay very close attention to what beliefs you actually have, and act on.
The beliefs I am aware of currently adhering to
...are a ridiculously tiny subset of the beliefs you actually hold and act upon, accumulated by observation, priming, and simple conditioning—none of which include any conscious involvement, awareness, study, or reflection, let alone reasoning.
And these beliefs—whether irrational or arational—form the framework through which you view the world and make judgments about it… including such judgments as to which reflected, conscious beliefs you should hold.
I thought my phrasing made it clear enough that it wasn’t just a gut feeling, but that my opposition to it was strong to the point where I actually had a gut feeling about it, not just an intellectual position. Instead of merely thinking “I know that’s wrong” I was thinking “How could that possibly be right.”
My point is that the feeling invariably precedes the logic, because it’s the feeling that told you to look for reasons to justify your initial impression.
I mean, I doubt you’re claiming that you were simply studying the situation and, after accumulating various bits of evidence with no preconception whatsoever, suddenly stumbled upon a realization that this was the way it was!
Instead, the sequence was almost certainly that you read the post, felt something bothering you about it, and then went looking for what already bothered you. And what bothered you was not a product of reasoning-in-the-moment, but merely a cached thought… perhaps that persuasion is bad, or bad for rationalists, or maybe even just a dislike of people saying something is $1/day.
Now, I’m not saying this is unique to you, or that I don’t do it. After all, I went through the exact same sequence in order to reply to you!
I’m just saying, I don’t think it’s a good idea to trust it when I get “a bad feeling about this” until I’ve rooted out the cached thought and crosschecked it against the actual situation. And even knowing this, I still forget or goof it up. A LOT.
Which means I have a low expectation of trustworthiness when someone speaks highly of their gut feelings instead, as though they were some sort of verification of truth, rather than a highly suspicious sign of cached thoughts and bottom-line reasoning.
I don’t have time to continue this discussion right now. I just wanted to mention something that’s bothering me. Right now it looks like I’m getting voted up and you down, and that’s stupid. Your comments aren’t of lower quality than mine, they’re simply disagreeing with mine. There seem to be only a couple people doing it (I don’t vote on threads I’m involved in), so I say to them… please vote according to the quality and relevance of the comment, not how much you like the content. Vote up to signal agreement if you must, but don’t vote down if the comment is clearly on topic and well written.
I’m afraid that I disagree with you on the quality of PJ Eby’s contributions, and that I have neither the time nor inclination to take direct part in this thread.

Oh great, now they’re downvoting you, too. ;-)
please vote according to the quality and relevance of the comment, not how much you like the content. Vote up to signal agreement if you must, but don’t vote down if the comment is clearly on topic and well written.
On behalf of all those who suffer from extreme karma-loss-aversion, I want to second this message. Please don’t downvote just because you disagree on substance.
(I came close to deleting a recent comment of mine that was downvoted shortly after being posted. It’s now at +12.)