Actually, yes, it does. The results of the theory should agree with our common sense and folk wisdom when dealing with situations on ordinary human scales.
You’re making two claims here. First, you’re making a substantive claim about the general reliability of human intuitions and cultural traditions when it comes to the human realm. Second, you’re making a semantic claim about what ‘It all adds up to normality’ means.
The former doctrine would be extremely difficult to substantiate. What evidence do you have to back it up? And the latter claim is clearly not right in any sense in which this community uses the term: the LW posts about Egan’s Law speak of the recreation of the ordinary world of perception, not of the confirmation of folk wisdom or tradition. The LessWrong Wiki explicitly speaks of normality as ‘observed reality’, not as our body of folk theory. Which is a good thing, since otherwise Egan’s Law would directly contradict the principle “Think Like Reality”:
> Quantum physics is not “weird”. You are weird. You have the absolutely bizarre idea that reality ought to consist of little billiard balls bopping around, when in fact reality is a perfectly normal cloud of complex amplitude in configuration space. This is your problem, not reality’s, and you are the one who needs to change.
>
> Human intuitions were produced by evolution and evolution is a hack.
Indeed, I would say that this claim, that our natural intuitions and common sense and folk wisdom and traditions are wont to be systematically mistaken, is one of the most foundational LessWrong claims. It lies at the very core of the utility of the heuristics/biases literature, which is a laundry list of ways we systematically misconstrue or imperfectly construe the truth. LessWrong is about not trusting your intuitions and cultural traditions (except where they have already been independently confirmed, or where the cost of investigating them exceeds the expected benefit of bothering to confirm them—and in neither case is this concession an affirmation of any intrinsic trustworthiness on the part of ‘common sense’ or ‘intuition’ or ‘folk wisdom’ or ‘tradition’).
It is true that common sense comes from somewhere, and that the existence of intuitions and cultural assumptions is a part of ‘normality’, is part of what a theory must ultimately account for and predict. But the truth of those beliefs is not a part of ‘normality’, is not a part of the data, the explanandum. They may or may not turn out to be correct; but there is no Bayesian reason to think that they must turn out right in the end, or even that they must turn out to resemble the right answer at all.
First let me repeat part of my comment with the phrase you seem to have missed in bold:
> The results of the theory should agree with our common sense and folk wisdom when dealing with situations **on ordinary human scales**.
In particular, had Newton claimed that apples fall up, that would have been reason to reject his theory.
> Human intuitions were produced by evolution and evolution is a hack.
That nevertheless works, and frequently works better than what our System II (conscious reasoning)-based theories can do. And remember, our conscious reasoning is itself also a product of evolution.
> Indeed, I would say that this claim, that our natural intuitions and common sense and folk wisdom and traditions are wont to be systematically mistaken…
A program that computes the area of a circle using pi = 3.14 will be systematically mistaken, but it is also likely to be sufficiently close for all practical purposes.
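To make the point concrete, here is a minimal sketch (in Python; the example and numbers are my own illustration, not part of the original exchange) comparing the pi = 3.14 approximation against the full-precision value:

```python
import math

def circle_area_folk(radius: float) -> float:
    """Circle area using the 'folk' approximation pi = 3.14."""
    return 3.14 * radius ** 2

def circle_area_exact(radius: float) -> float:
    """Circle area using full-precision math.pi."""
    return math.pi * radius ** 2

r = 10.0
folk, exact = circle_area_folk(r), circle_area_exact(r)
rel_error = abs(folk - exact) / exact
print(f"folk: {folk:.4f}  exact: {exact:.4f}  relative error: {rel_error:.2e}")
# Relative error is about 5e-4: biased low on every single call
# (systematically mistaken), yet close enough for most practical purposes.
```

The approximation errs in the same direction every time, which is exactly what ‘systematically mistaken’ means here; the bias is simply small.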
> LessWrong is about not trusting your intuitions and cultural traditions (except where they have already been independently confirmed, or where the cost of investigating them exceeds the expected benefit of bothering to confirm them—and in neither case is this concession an affirmation of any intrinsic trustworthiness on the part of ‘common sense’ or ‘intuition’ or ‘folk wisdom’ or ‘tradition’).
Your intuitions and cultural traditions are evidence. As for possessing “intrinsic trustworthiness”, I have no idea what you mean by that phrase.
> They may or may not turn out to be correct; but there is no Bayesian reason to think that they must turn out right in the end, or even that they must turn out to resemble the right answer at all.
There is a Bayesian reason to think that our intuitions will in most cases resemble the right answer, at least in the sense that GR resembles Newtonian mechanics.
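As a back-of-the-envelope illustration of that resemblance (my own sketch, not part of the original comment): the size of the leading general-relativistic correction to Newtonian gravity at a given scale is roughly the dimensionless ratio GM/(rc²), and at human scales that ratio is tiny:

```python
# Rough size of the leading GR correction to Newtonian gravity:
# the dimensionless ratio G*M / (r * c^2). The constants are standard;
# treating this ratio as "fractional correction" is an order-of-magnitude
# heuristic, not a full general-relativity calculation.
G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8     # speed of light, m/s

def gr_correction(mass_kg: float, radius_m: float) -> float:
    return G * mass_kg / (radius_m * c**2)

# At the Earth's surface: of order one part per billion.
print(f"Earth surface: {gr_correction(5.972e24, 6.371e6):.1e}")
# At the Sun's surface: still only about two parts per million.
print(f"Sun surface:   {gr_correction(1.989e30, 6.957e8):.1e}")
```

The claim, as I read it, is that intuition resembles the truth in the same sense: its predictions are close in the regime it was ‘trained’ on, even if the underlying theory is wrong.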
But this just isn’t so. Humans get things wrong about the human realm all the time, make false generalizations, and trust deeply erroneous intuitions and aphorisms every day of their lives. ‘It all adds up to normality’ places a hard constraint on all reasonable theories: they must exactly reproduce the data of ordinary life. In contrast, what you mean by ‘It all adds up to normality’ seems to be more like ‘Our naive beliefs are generally right!’ The former claim is The Law (specifically, Egan’s Law); the latter is a bit of statistical speculation, seems in tension with the historical record and the contemporary psychology literature, and even if not initially implausible, would still need a lot of support before it could be treated as established Fact. So conflating these two claims is singularly dangerous and misleading.
> That nevertheless works, and frequently works better than what our System II (conscious reasoning)-based theories can do.
You’re conflating three different claims.
1. Egan’s Law: The correct model of the world must yield the actual data/evidence we observe.
2. We should generally expect our traditions, intuitions, and folk theories to be correct in their human-scale claims.
3. Our biases are severe, but not cripplingly so, and they are quite handy given our evolutionary history and resource constraints.
‘It all adds up to normality’ means 1, not 2. And the claim I was criticizing is 2, not 3 (the one you’re now defending).
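The difference between claims 1 and 2 can be put in Bayesian terms (a sketch of my own, with invented numbers): claim 1 is a constraint on likelihoods, claim 2 is a claim about priors. A model that assigns near-zero likelihood to the everyday data is eliminated no matter how plausible its prior was:

```python
# Toy Bayesian update over three hypothetical world-models, separating
# claim 1 (likelihood constraint) from claim 2 (prior accuracy).
# All numbers are invented for illustration.
priors = {"folk_model": 0.70, "weird_model_A": 0.25, "weird_model_B": 0.05}

# P(everyday observations | model). Egan's Law says the true model must
# make the ordinary data probable; a model that can't is ruled out.
likelihoods = {"folk_model": 1e-6,     # intuitive, but fits the data poorly
               "weird_model_A": 0.9,   # counterintuitive, fits the data
               "weird_model_B": 0.9}

unnorm = {m: priors[m] * likelihoods[m] for m in priors}
total = sum(unnorm.values())
posteriors = {m: p / total for m, p in unnorm.items()}
for m, p in sorted(posteriors.items(), key=lambda kv: -kv[1]):
    print(f"{m:14s} posterior = {p:.4f}")
# The high-prior folk model ends up with posterior ~0: adding up to
# normality constrains what a theory must predict, not which theory
# felt plausible beforehand.
```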
> A program that computes the area of a circle using pi = 3.14 will be systematically mistaken, but it is also likely to be sufficiently close for all practical purposes.
The evidence shows that in a great many cases, our intuitions and traditions aren’t just useful approximations of the truth, like Newtonian physics; they’re completely off-base. A lot of folk wisdom asserts just the opposite of the truth, not only about metaphysics but about ordinary human history, psychology, and society. So if ‘it all adds up to normality’ means ‘it all (in the human realm) confirms our folk expectations and intuitions’, then ‘it all adds up to normality’ is false. (But, as noted, this isn’t what ‘normality’ means here.)
> Your intuitions and cultural traditions are evidence.
Sure, they’re evidence; but they’re not very strong evidence without external support. And they’re data; but the data in question is that something is intuitive, not that the intuition itself is correct. The claims made by our scientifically uncultivated intuitions and culture are just models like any other, and can be confirmed or disconfirmed like any scientific model, no matter how down-to-earth and human-scaled they are. They do not have the special status of ‘normality’ assigned to the data—data, not theory—of everyday life, that Egan’s Law draws our attention to.
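To put a number on ‘evidence, but not very strong’ (again my own sketch, with invented likelihood ratios): in odds form, a Bayesian update multiplies prior odds by a likelihood ratio, and an intuition with a small ratio moves the posterior far less than a controlled observation with a large one:

```python
# Odds-form Bayes: posterior_odds = prior_odds * likelihood_ratio.
# The likelihood ratios below are invented for illustration.
def update(prior_odds: float, likelihood_ratio: float) -> float:
    return prior_odds * likelihood_ratio

def odds_to_prob(odds: float) -> float:
    return odds / (1 + odds)

prior_odds = 1.0        # start at 50/50
intuition_lr = 2.0      # "it feels true": weak evidence
experiment_lr = 20.0    # a controlled test: strong evidence

print(odds_to_prob(update(prior_odds, intuition_lr)))    # ~0.67
print(odds_to_prob(update(prior_odds, experiment_lr)))   # ~0.95
# An intuition shifts 50% to ~67%: genuine evidence, but it takes
# independent support to reach confident conclusions.
```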
When Newton(?) claimed that objects of different mass but negligible air resistance fell at the same rate, that theory was rejected.
Galileo.
Natural intuitions, common sense, and folk wisdom have consistently shown that they cannot recognize a theory that explains the actual observations better than they themselves do.
Common sense and folk wisdom say that this has changed, and that we will now accept a new, more correct theory without challenging it.