Remember, it all adds up to normality. Thus we should not be surprised that the conclusions of evo-psych agree with traditional ideas.
We should expect a perfected biology to predict our cultural data, not to agree with our cultural beliefs. ‘Normality’ doesn’t mean our expectations. ‘Normality’ doesn’t mean common sense or folk wisdom. It means our actual experiences. See Living in Many Worlds.
When people claim otherwise, their final argument tends to be a lot less convincing and to involve a lot more mental gymnastics than the original.
How strong is that tendency? Try to quantify it. Then test the explanations where possible, after writing down your prediction. Did the first get an unfair advantage from status quo bias? Did the rivals seem gerrymandered and inelegant because reality is complicated? Did any of the theories tend to turn out to be correct?
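The "write down your prediction, then test" discipline can be made mechanical. Here is a minimal sketch (the theories and all probabilities are invented for illustration) that scores each explanation's recorded probabilistic predictions with the Brier score, so hindsight and status quo bias have less room to operate:

```python
def brier_score(predictions):
    """Mean squared gap between stated probability and outcome (0 is perfect)."""
    return sum((p - outcome) ** 2 for p, outcome in predictions) / len(predictions)

# Each entry: (probability written down BEFORE the test, what actually happened:
# 1 = the predicted event occurred, 0 = it did not).
first_explanation = [(0.9, 1), (0.9, 0), (0.8, 0), (0.7, 1)]
rival_explanation = [(0.6, 1), (0.7, 0), (0.8, 1), (0.9, 1)]

# Lower is better: here the rival's predictions, however inelegant they
# seemed in advance, fit the outcomes more closely.
print(brier_score(first_explanation))  # 0.3875
print(brier_score(rival_explanation))  # 0.175
```

Writing the numbers down first is the whole point: the score cannot be argued with after the fact.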
‘Normality’ doesn’t mean common sense or folk wisdom.
Actually, yes it does. The results of the theory should agree with our common sense and folk wisdom when dealing with situations on ordinary human scales (or whatever the appropriate analog of “ordinary human scales” is).
Actually, yes it does. The results of the theory should agree with our common sense and folk wisdom when dealing with situations on ordinary human scales
You’re making two claims here. First, you’re making a substantive claim about the general reliability of human intuitions and cultural institutions when it comes to the human realm. Second, you’re making a semantic claim about what ‘It all adds up to normality’ means.
The former doctrine would be extremely difficult to substantiate. What evidence do you have to back it up? And the latter claim is clearly not right in any sense this community uses the term, as the LW posts about Egan’s Law speak of the recreation of the ordinary world of perception, not of the confirmation of folk wisdom or tradition. The LessWrong Wiki explicitly speaks of normality as ‘observed reality’, not as our body of folk theory. Which is a good thing, since otherwise Egan’s Law would directly contradict the principle “Think Like Reality”:
“Quantum physics is not “weird”. You are weird. You have the absolutely bizarre idea that reality ought to consist of little billiard balls bopping around, when in fact reality is a perfectly normal cloud of complex amplitude in configuration space. This is your problem, not reality’s, and you are the one who needs to change.”
“Human intuitions were produced by evolution and evolution is a hack.”
Indeed, I would say that this claim, that our natural intuitions and common sense and folk wisdom and traditions are wont to be systematically mistaken, is one of the most foundational LessWrong claims. It lies at the very core of the utility of the heuristics/biases literature, which is a laundry list of ways we systematically misconstrue or imperfectly construe the truth. LessWrong is about not trusting your intuitions and cultural traditions (except where they have already been independently confirmed, or where the cost of investigating them exceeds the expected benefit of bothering to confirm them—and in neither case is this concession an affirmation of any intrinsic trustworthiness on the part of ‘common sense’ or ‘intuition’ or ‘folk wisdom’ or ‘tradition’).
It is true that common sense comes from somewhere, and that the existence of intuitions and cultural assumptions is a part of ‘normality’, is part of what a theory must ultimately account for and predict. But the truth of those beliefs is not a part of ‘normality’, is not a part of the data, the explanandum. They may or may not turn out to be correct; but there is no Bayesian reason to think that they must turn out right in the end, or even that they must turn out to at all resemble the right answer.
First let me repeat part of my comment with the phrase you seem to have missed in bold:
The results of the theory should agree with our common sense and folk wisdom **when dealing with situations on ordinary human scales**
In particular, had Newton claimed that apples fall up, that would have been reason to reject his theory.
“Human intuitions were produced by evolution and evolution is a hack.”
That nevertheless works, and frequently works better than what our System II (conscious reasoning)-based theories can do. And remember our conscious reasoning is itself also a product of evolution.
Indeed, I would say that this claim, that our natural intuitions and common sense and folk wisdom and traditions are wont to be systematically mistaken,
A program to compute the area of a circle using pi = 3.14 will be systematically mistaken, but it is also likely to be sufficiently close for all practical purposes.
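The circle-area example can be made concrete. A minimal sketch (the radius is an arbitrary illustrative value): the truncated constant produces an error that is systematic, always in the same direction, yet negligible for everyday purposes.

```python
import math

def circle_area_folk(r):
    """Area using the truncated constant pi = 3.14 (the 'folk' value)."""
    return 3.14 * r * r

def circle_area_exact(r):
    """Area using the full-precision constant."""
    return math.pi * r * r

r = 10.0
# The folk answer is always biased low, since 3.14 < pi...
bias = circle_area_exact(r) - circle_area_folk(r)
# ...but the relative error is about 0.05%: systematic, yet close enough
# for nearly all practical purposes.
relative_error = bias / circle_area_exact(r)
print(f"bias = {bias:.3f}, relative error = {relative_error:.4%}")
```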
LessWrong is about not trusting your intuitions and cultural traditions (except where they have already been independently confirmed, or where the cost of investigating them exceeds the expected benefit of bothering to confirm them—and in neither case is this concession an affirmation of any intrinsic trustworthiness on the part of ‘common sense’ or ‘intuition’ or ‘folk wisdom’ or ‘tradition’).
Your intuitions and cultural traditions are evidence. As for possessing “intrinsic trustworthiness” I have no idea what you mean by that phrase.
They may or may not turn out to be correct; but there is no Bayesian reason to think that they must turn out right in the end, or even that they must turn out to at all resemble the right answer.
There is a Bayesian reason to think that our intuitions will in most cases resemble the right answer, at least in the sense that GR resembles Newtonian mechanics.
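One way to unpack "resembles, at least in the sense that GR resembles Newtonian mechanics": in the regime where the old theory was actually tested, the successor reproduces its predictions almost exactly, and diverges only in extreme regimes. A sketch of that correspondence (using special relativity's kinetic energy rather than full GR, purely for simplicity; the masses and speeds are illustrative):

```python
C = 299_792_458.0  # speed of light, m/s

def ke_newton(m, v):
    """Newtonian kinetic energy: (1/2) m v^2."""
    return 0.5 * m * v * v

def ke_relativistic(m, v):
    """Special-relativistic kinetic energy: (gamma - 1) m c^2."""
    gamma = 1.0 / (1.0 - (v / C) ** 2) ** 0.5
    return (gamma - 1.0) * m * C * C

m = 1.0  # kg
# At 3 km/s (roughly orbital speed) the predictions agree to better than
# one part in ten thousand: the old theory survives as a limiting case.
slow_ratio = ke_relativistic(m, 3e3) / ke_newton(m, 3e3)
# At 90% of light speed they disagree by more than a factor of three.
fast_ratio = ke_relativistic(m, 0.9 * C) / ke_newton(m, 0.9 * C)
print(slow_ratio, fast_ratio)
```

If intuitions "resemble" the truth only in this correspondence-limit sense, they remain reliable inside their training regime and fail badly outside it.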
But this just isn’t so. Humans get things wrong about the human realm all the time, make false generalizations and trust deeply erroneous intuitions and aphorisms every day of their lives. ‘It all adds up to normality’ places a hard constraint on all reasonable theories: They must reproduce exactly the data of ordinary life. In contrast, what you mean by ‘It all adds up to normality’ seems to be more like ‘Our naive beliefs are generally right!’ The former claim is The Law (specifically, Egan’s Law); the latter is a bit of statistical speculation, seems in tension with the historical record and the contemporary psychology literature, and even if not initially implausible would still need a lot of support before it could be treated as established Fact. So conflating these two claims is singularly dangerous and misleading.
That nevertheless works, and frequently works better than what our System II (conscious reasoning)-based theories can do.
You’re conflating three different claims.
1. Egan’s Law: The correct model of the world must yield the actual data/evidence we observe.
2. We should generally expect our traditions, intuitions, and folk theories to be correct in their human-scale claims.
3. Our biases are severe, but not cripplingly so, and they are quite handy given our evolutionary history and resource constraints.
‘It all adds up to normality’ means 1, not 2. And the claim I was criticizing is 2, not 3 (the one you’re now defending).
A program to compute the area of a circle using pi = 3.14 will be systematically mistaken, but it is also likely to be sufficiently close for all practical purposes.
The evidence shows that in a great many cases, our intuitions and traditions aren’t just useful approximations of the truth, like Newtonian physics; they’re completely off-base. A lot of folk wisdom asserts just the opposite of the truth, not only about metaphysics but about ordinary human history, psychology, and society. So if ‘it all adds up to normality’ means ‘it all (in the human realm) confirms our folk expectations and intuitions’, then ‘it all adds up to normality’ is false. (But, as noted, this isn’t what ‘normality’ means here.)
Your intuitions and cultural traditions are evidence.
Sure, they’re evidence; but they’re not very strong evidence, without external support. And they’re data; but the data in question is that something is intuitive, not that the intuition itself is correct. The claims made by our scientifically uncultivated intuitions and culture are just models like any other, and can be confirmed or disconfirmed like any scientific model, no matter how down-to-earth and human-scaled they are. They do not have the special status of ‘normality’ assigned to the data—data, not theory—of everyday life, that Egan’s Law draws our attention to.
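"Evidence, but not very strong evidence" has a standard Bayesian cash-out: a likelihood ratio close to 1. A toy update (all probabilities are hypothetical) contrasting "most people find the claim intuitive" with an actual controlled test:

```python
def posterior(prior, p_e_if_true, p_e_if_false):
    """Bayes' rule: P(H | E) from P(H), P(E | H), and P(E | not-H)."""
    num = p_e_if_true * prior
    return num / (num + p_e_if_false * (1.0 - prior))

prior = 0.5  # agnostic before hearing anything

# E1 = "the claim strikes most people as obviously true".
# Suppose intuitions fire slightly more often for true claims: a weak
# likelihood ratio of 0.8 / 0.6, so the update is small.
after_intuition = posterior(prior, 0.8, 0.6)    # ~0.57

# E2 = "a controlled experiment confirmed the claim": a likelihood ratio
# of 0.8 / 0.01, so the update is large.
after_experiment = posterior(prior, 0.8, 0.01)  # ~0.99

print(after_intuition, after_experiment)
```

On these (invented) numbers, learning that a claim is intuitive moves a rational agent only a few percentage points; the intuition is data, but weak data.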
When Newton(?) claimed that objects of different mass but negligible air friction fell at the same rate, that theory was rejected.
Galileo.
Natural intuitions, common sense, and folk wisdom have consistently shown themselves unable to identify the theories that explain the actual observations better than they do.
Common sense and folk wisdom now say that has changed, and that we will accept a new, more correct theory without challenging it.
So the Aztecs add up to normal? Because I’m not seeing how a culture that thought human sacrifice was a virtue has much folk wisdom in common with the modern era.
Why was this downvoted? This is an on-point and sane response. If ‘normality’ is defined as tradition, then history should be a sequence of confirmations of past traditions and common sense (e.g., the common-sense claim that the Earth is stationary and the Sun goes around it), rather than a sequence of disconfirmations of our beliefs accompanied by alternative explanations of our perceptual data. The epistemic disagreement between cultures, and the epistemic change within cultures, both refute this idea.
‘It all adds up to normality’ in the original sense of ‘The right theory must predict our ordinary experience’ is a correct generalization 100% of the time. (It’s The Law.) ‘It all adds up to normality’ in Eugine’s sense of ‘The right theory must agree with folk wisdom, common sense, and traditional doctrine’ is a generalization that, historically, has almost never been right strictly, and has only sometimes been right approximately.
Well, I didn’t downvote, but saying the Aztecs viewed human sacrifice as a virtue is at best an oversimplification. They sacrificed a lot of people and believed they were virtuous in doing so, but my understanding is that sacrifice within the Aztec belief system was instrumental to worship, not virtuous in itself; you wouldn’t be lauded for sacrificing your neighbor to Homer Simpson, only for sacrifices serving established religious goals.
The broader point, though, seems to be that the appeal to societal normality equally well justifies norms that call for (e.g.) sacrificing children to the rain god Tlaloc, if sufficiently entrenched. That logic seems sound to me.
I think it would be missing Tim’s point to suppose that he’s ascribing some sort of quasi-Kantian value-in-itself to Aztec meta-ethics, when all he seems to be noting is that the Aztecs got torture wrong. If you want to reserve ‘virtue’ for a more specific and historically bound idea in Western ethics, I doubt he’d mind your paraphrasing his point in your preferred idiom. It takes a pretty wild imagination to read Tim’s comment and think he’s saying that the Aztecs considered human sacrifice a summum bonum or unconditionally and in-all-contexts good. That’s just not what the conversation is about.
Yeah, you’re right.
Given the way they ran their empire, it would probably have collapsed without the intimidation factor that human sacrifice provided.
If we take as given that a nonspecific criminal act against a specific-but-not-here-identified person is required to sustain an empire, does that mean that drone strikes have virtue?
(Vague and awkward phrasing to avoid discussing a violent act against an identifiable person) (Note the hypothetical statement; in the least convenient world that statement is provably true)
I never said anything about virtue, merely about cause and effect.
How is that consistent with the Aztecs adding up to ‘normal’ as used upthread?
I’m not sure I understand quite what this means.
To clarify: if at a given time common sense and folk wisdom are understood to predict a result R1 from experiment E where E involves a situation on ordinary human scales (or some appropriate analog), and at some later time E is performed and gives result R2 instead, would you consider that state of affairs consistent with the rule “the results of the theory should agree with our common sense and folk wisdom”, or in conflict with it?