It’s not that there aren’t any people whose unsupported assertion is more trustworthy than an explicit, persuasive-sounding argument for the opposite side, though certainly individuals of such discernment and integrity are rare. The issue is that any person so reliable would also necessarily have enough underlying intelligence to be able to, in any situation not involving implausibly extreme levels of time pressure, construct a better (or at least more contextually specific) argument than “you’ll understand when you’re older.” The only plausible explanation for being so vague is that they not only don’t want to tell you, but are also trying not to provide enough keywords for you to look up the real reason yourself.
I’m slightly less cynical; I think they usually do in fact genuinely believe that you’ll change your mind and agree with them many years later. The people I’ve seen this with tend not to be good at putting feelings into words.
By the way, I’d love to see someone steelman the experience argument (but am too lazy to do it myself). Anyone up for it?
I’m not saying that someone making the “understand when you’re older” argument is being dishonest. They might not even be incorrect. It’s just that, if that’s the best case they can come up with, even after thinking it over, you’re probably better off making your decision on some basis other than their opinion.
In addition, experience should be transferable; in other words, if you think there’s some “experience” to be had that will convince me that I’m wrong about something, you should be able to convey the details of such an experience (as well as why you think the experience would be convincing) to me directly, e.g. Quirrell’s conversation with Hermione:
“In all honesty,” said Professor Quirrell, looking up at the stars, “I still don’t understand it. They should have known that their lives depended on that man’s success. And yet it was as if they tried to do everything they could to make his life unpleasant. To throw every possible obstacle into his way. I was not naive, Miss Granger, I did not expect the power-holders to align themselves with me so quickly—not without something in it for themselves. But their power, too, was threatened; and so I was shocked how they seemed content to step back, and leave to that man all burdens of responsibility. They sneered at his performance, remarking among themselves how they would do better in his place, though they did not condescend to step forward.” Professor Quirrell shook his head as though in bemusement. “And it was the strangest thing—the Dark Wizard, that man’s dread nemesis—why, those who served him leapt eagerly to their tasks. The Dark Wizard grew crueler toward his followers, and they followed him all the more. Men fought for the chance to serve him, even as those whose lives depended on that other man made free to render his life difficult… I could not understand it, Miss Granger.” Professor Quirrell’s face was in shadow, as he looked upward. “Perhaps, by taking on himself the curse of action, that man removed it from all others? Was that why they felt free to hinder his battle against the Dark Wizard who would have enslaved them all? Believing men would act in their own interest was not cynicism, it turned out, but sheerest optimism; in reality men do not meet so high a standard. And so in time that one realized he might do better fighting the Dark Wizard alone, than with such followers at his back.”
Quirrell didn’t just say, “Oh, you’ll change your mind later when you experience life a bit more” (although he has done that to Harry sometimes); he actually laid out the details of the argument. That’s not to say he was right, but at least his argument is a lot more convincing than just a naked claim that “you’ll change your mind with experience”.
In other words, any time someone makes the experience argument and is able to do so in a legitimate manner, he/she should also be able to give a fairly concise summary of the experience as well as why it should be convincing to his/her opponent. Ideally, then, no one should make the experience argument at all, because every time someone does so, either (a) the argument is illegitimate or (b) there’s a better argument readily available that should be used instead. Because of this, if someone makes the experience argument to me without an accompanying summary, then it immediately tells me that he/she is just using it as a semantic stopsign and is not arguing in good faith. This in turn leads me to get out of the conversation rather quickly, if possible.
Not every experience has a concise summary.

If someone tells me that a complex mathematical proof comes to a certain conclusion, then I often have to make a decision between trusting their expertise and spending a few semesters studying math to get the underlying understanding needed to follow the proof myself.
Even in the case you described, it should be possible to lay out something to support the argument, even if it takes too long to cover the argument itself. Like, instead of just saying “I know this proof seems counterintuitive right now, but trust me, once you study a bit more, it’ll all make sense,” the person would do better to say, “Well, I know it sounds absurd that you’d be able to take a single object, disassemble it, and reassemble it to form two of the same object, but in fact it has been proven to be possible given infinite divisibility and something called the Axiom of Choice. If you’re not familiar with that, I’d suggest reading a bit about set theory.” My credence in the latter case would be much higher than in the former case, and it didn’t seem to take that long just to say, “Okay, this statement has its roots in X, Y, and Z, so to understand, you’ll want to study those.” I maintain that it should be possible to give some context for the experience argument no matter what, and that if you don’t do so, you’re not trying to argue in good faith.
“Well, I know it sounds absurd that you’d be able to take a single object, disassemble it, and reassemble it to form two of the same object, but in fact it has been proven to be possible given infinite divisibility and something called the Axiom of Choice. If you’re not familiar with that, I’d suggest reading a bit about set theory.”
I actually can give you an “intuitive” justification of the Banach-Tarski theorem.
Suppose you have a rigid ball full of air. If you take half the air out and put it into another, identical ball, you now have twice the volume of air, at half the density. However, the points in a mathematical ball are infinitely dense—half of infinity is still infinity, so it turns out that if you do it just right, you can take out “half” of the points from a mathematical ball, put them inside another one, and end up with two balls that are both “completely full” and identical to the original one.
Your explanation suggests the wrong intuition for Banach-Tarski.
It’s relatively easy to show that there’s a bijection between the points contained in one ball and the points contained in two balls. (Similarly, there is a bijection between the interval [0,1] and the interval [0,2].)
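For concreteness, one such bijection for the intervals (a minimal sketch; the ball case is the same idea with more bookkeeping):

\[ f : [0,1] \to [0,2], \qquad f(x) = 2x. \]

It pairs the points up perfectly, but it doubles every distance, so it tells you nothing about volume; the cardinality fact comes cheap. The surprise in Banach-Tarski is precisely that no such stretching is allowed.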
The Banach-Tarski theorem proves a harder statement: you can take a unit ball, partition it into finitely many pieces (I think it can be done with five), and then rearrange those pieces, using only translations and rotations, into two unit balls.
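Spelled out a bit more formally (this is the standard statement as I remember it, so treat the exact piece count as approximate): there are disjoint sets \( A_1, \dots, A_n \) and rigid motions \( g_1, \dots, g_n \) of \( \mathbb{R}^3 \) with

\[ B = A_1 \cup \dots \cup A_n, \qquad g_1 A_1 \cup \dots \cup g_n A_n = B \cup (B + v), \]

where \( B \) is the closed unit ball, \( B + v \) is a translated copy of it disjoint from \( B \), and the images \( g_i A_i \) are pairwise disjoint. The pieces \( A_i \) cannot be Lebesgue-measurable, which is where the Axiom of Choice earns its keep.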
(If there’s a canonical weird thing about the theorem, it’s that we can do this in three dimensions but not in two.)
Agreed; it’s not a real justification, it’s just something that makes it sound less absurd. (When you look at the theorem a little bit closer, the weird part becomes not that you can make two balls out of one ball, but that you can do it with just translations and rotations. And if you look really, really hard, the weird part becomes that you can’t do it with only four pieces.)
This intuitive justification likewise indicates that one should be able to do the Banach-Tarski thing with a 2-dimensional disc rather than a 3-dimensional ball. Unfortunately, that isn’t true. (Though it is if you allow area-preserving affine transformations as well as isometries.)
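(If anyone wants to know why the dimension matters, here is my understanding of the standard account, sketched rather than proved: \( SO(3) \) contains a free group \( F_2 \) on two generators \( a, b \), and \( F_2 \) is paradoxical all by itself. Writing \( W(x) \) for the set of reduced words beginning with \( x \),

\[ F_2 = \{e\} \cup W(a) \cup W(a^{-1}) \cup W(b) \cup W(b^{-1}), \qquad F_2 = W(a) \cup a\,W(a^{-1}) = W(b) \cup b\,W(b^{-1}), \]

so four pieces of the group reassemble, under left multiplication, into two copies of it; the theorem transfers this to the sphere and then the ball, with the Axiom of Choice picking orbit representatives. The isometry group of the plane, by contrast, is solvable and hence amenable, so it admits a finitely additive isometry-invariant measure defined on all subsets of the plane, and that rules out any paradoxical decomposition of the disc using isometries alone.)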
My steelman is this (without having read anything downstairs, so I apologise if there’s a better one extant): the world is a complicated place, and we all form beliefs based on the things we think are important in the world; and since we are all horrible reasoners, it’s impossible to believe of some things that they are important movers of the world without seeing them actually happen and viscerally feeling them change things.
Cognitive biases in yourself are like this, methinks. Your thought processes really need to be broken down repeatedly for you to be able to start seeing the subtle shifts happening inside you—and anticipating that they happen even when you don’t see them (generalising from many examples here, but not nearly enough).
Another difficult tripping point for me was intuitive reasoning. Until I saw people who couldn’t make any sense do significantly better than me, I could not possibly believe it, even when fighting against people who told me I over-analysed and spoke too much.
I’m slowly coming around on dishonest rhetorical stances, because of the amount of time I’ve spent trying to convince hostile arguers. Let me soothe the raised hackles of your inner LW-cat by saying that I can’t endorse anything like this without finding a Schelling fence,* and am willing to consider anyone who takes such a stance on LW (or in LW-related contexts) evil.
*In fact, based on the world being as it is, I strongly suspect there isn’t one.
I’m a professional computer programmer, a field founded on logic and reason. I’ve been doing it for a while, and am, if I say so myself, pretty good at it.
I still find myself frequently hitting places where the best argument I can make for a particular decision is “this feels intuitively like decision x and decision y, and those were the correct choices in those cases, therefore I think this is the correct decision too.” And very often that’s right.
My understanding of the evidence is that within specific fields, experts in that field develop intuitions that really can yield better decisionmaking than conscious reason. Logically, doesn’t it seem like this would be true of living in general?
Mentioning a similarity to past successful decisions seems like it qualifies as “constructing a more contextually specific argument than ‘you’ll understand when you’re older’”.
I guess, but the explicit comparison is usually pretty indefensible. “Isn’t this actually more like decision w, where the opposite choice was correct?” would be a natural response, and one I wouldn’t have any counterargument for.
The issue is that any person so reliable would also necessarily have enough underlying intelligence to be able to, in any situation not involving implausibly extreme levels of time pressure, construct a better (or at least more contextually specific) argument than “you’ll understand when you’re older.”
Excellent point, thanks a bunch for supplying it. That makes a lot of sense to me.