More knowledge about bias, which would particularly undermine the unfortunately common and well-regarded stance “I only believe what I see”. People rely too much on their direct feelings/intuitions without assessing them.
The idea that in order to have an accurate representation of reality, one must have background knowledge in science. Add in a little philosophy (recent philosophy, like Popper).
Also praising the ones who admit their mistakes—that happens too little.
The final idea would be like yours, more Bayesian thinking.
I’m probably too optimistic.
What do you mean by that? Especially given that you treat Popper as an important thinker?
Updating priors with evidence, as in the small numeric sketch below. Standing by your beliefs seems to be praised, at least where I live.
I consider Popper an important thinker; falsification is quite important right now, for example. Why do you seem to think he’s unimportant?
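To make “updating priors with evidence” concrete, here is a minimal sketch in Python; the prior, the likelihoods, and the single observation are made-up numbers, purely for illustration.

    # Minimal Bayes-rule update: P(H | E) = P(E | H) * P(H) / P(E).
    # All numbers below are invented for illustration only.

    prior_h = 0.30          # prior credence in hypothesis H
    p_e_given_h = 0.80      # how likely the evidence is if H is true
    p_e_given_not_h = 0.20  # how likely the evidence is if H is false

    # Total probability of seeing the evidence E.
    p_e = p_e_given_h * prior_h + p_e_given_not_h * (1 - prior_h)

    # Posterior credence in H after observing E.
    posterior_h = p_e_given_h * prior_h / p_e
    print(posterior_h)  # ~0.63: the evidence raises the credence; it does not "confirm" H outright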
The funniest thing about Popper is that I don’t get the impression that he, or most of the people reading him, seek to falsify his theories. Often because, since it’s philosophy, the rules of falsification supposedly don’t matter. Popper didn’t try to study scientists and how they come up with new scientific findings in order to try to falsify his own hypothesis.
From a more LW perspective, Lukeprog writes:
“For centuries, philosophers wondered how we could learn what causes what. Some argued it was impossible, or possible only via experiment. [...] Then, in the 1990s, a breakthrough: Judea Pearl and others showed that, in principle, we can sometimes infer causal relations from data even without experiment, via the mathematical machinery of probabilistic graphical models.”
Popper isn’t really a recent thinker. He wrote 50 years ago. Jaynes wrote his book in the 1990s. Kahneman’s work wasn’t known before that time.
We have modern tools to deal with uncertainty, like credence calibration. We have found that in cases with low costs of false positives, trusting intuition is highly useful, and that most experts make a lot of their decisions based on intuition rather than analytical reasoning.
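As a rough illustration of what a credence-calibration check can look like, here is a toy Python sketch: it bins stated probabilities and compares them with how often the claims turned out true. The predictions, outcomes, and bucketing scheme are all invented for the example.

    # Toy credence-calibration check: for each confidence bucket, compare the
    # stated probability with the observed frequency of being right.
    # The predictions and outcomes below are made up purely for illustration.
    from collections import defaultdict

    predictions = [0.9, 0.9, 0.8, 0.7, 0.7, 0.6, 0.9, 0.8, 0.6, 0.7]  # stated credences
    outcomes    = [1,   1,   0,   1,   0,   1,   1,   1,   0,   1  ]  # 1 = claim turned out true

    buckets = defaultdict(list)
    for p, o in zip(predictions, outcomes):
        buckets[round(p, 1)].append(o)

    for credence in sorted(buckets):
        hits = buckets[credence]
        print(f"stated {credence:.0%} -> right {sum(hits) / len(hits):.0%} of {len(hits)} claims")

Well-calibrated credences are the ones where the two columns roughly match across buckets.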
I may be mad, but I actually think of Popper more or less in the same breath as Bayesianism—modus tollens and reductio (the main methods of Popperian “critical rationalism”—CR basically says that the reductio is the model of all successful empirical reasoning) just seem to me to be special cases of Bayesianism. The idea with both (as I see it) is that we start where we are and get to the truth by shaving away untruths, by testing our ideas to destruction and going with what’s left standing because we’ve got nothing better left standing—that seems to me the basic gist of both philosophies.
I’m also fond of the idea that knowledge is always conjecture, and that belief has nothing to do with knowledge (and knowledge can occasionally be accidental). Knowledge is just the “aperiodic crystals” of language in its manifest forms (ink on paper, sounds out of a mouth, coding, or whatever), which, by convention (“language games”), represent or model reality either accurately or not, regardless of psychological state of belief.
Furthermore, while I’m on my high horse: Bayesianism is conjectural deductive reasoning; neither “subjective” nor “objective” approaches have anything to do with it. It doesn’t “update beliefs”; it updates, modifies, and discards conjectures.
IOW, you take a punt, a bet, a conjecture (none of which have anything to do with belief) at how things are, objectively. The punt is itself in the form of a “language crystal”, objectively out there in reality, in some embodied form, which is something embedded in reality that conventionally models reality, as above—again, nothing to do with belief.
In this context, truth and objectivity (in another sense) are ideals—things we’re aiming for. It may be the case that there is no true proposition, but when we say we have a probably true proposition, what that means is that we have a ranking of conjectures against each other, in a ratio, and the most probable is the most provable (the one that can be best corroborated—in the Popperian sense—by evidence). That’s all.
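To make the “special case of Bayesianism” point concrete, here is a minimal Python sketch with invented priors and likelihoods: a conjecture that strictly entails a prediction gets posterior zero when the prediction fails (modus tollens recovered as a limiting case), and the conjectures left standing are ranked against each other by posterior odds.

    # Sketch: falsification as the limiting case of a Bayesian update, plus
    # ranking of surviving conjectures by posterior odds. All numbers are invented.

    def posterior(prior, p_data_given_h, p_data_given_not_h):
        """P(H | data) via Bayes' rule."""
        p_data = p_data_given_h * prior + p_data_given_not_h * (1 - prior)
        return p_data_given_h * prior / p_data

    # Conjecture A strictly entails prediction E, so P(not-E | A) = 0.
    # Observing not-E then gives P(A | not-E) = 0: modus tollens as a limiting case.
    print(posterior(prior=0.5, p_data_given_h=0.0, p_data_given_not_h=0.6))  # 0.0

    # Two surviving conjectures, B and C, ranked by how strongly each predicted
    # the evidence that was actually observed.
    post_b = posterior(prior=0.5, p_data_given_h=0.9, p_data_given_not_h=0.3)
    post_c = posterior(prior=0.5, p_data_given_h=0.4, p_data_given_not_h=0.3)
    print(post_b / (1 - post_b), post_c / (1 - post_c))  # B ends up better corroborated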
I agree with gurugeorge’s response and see Popper the same way.
That said, I do think that although Pearl’s work is great, the key phrase is “in principle”: the methods rely on a number of assumptions that you can’t test (like independence), and he also says that experiment is the only guaranteed way to establish causation (in his talk “The Art and Science of Cause and Effect”). I may also be wrong: the talk was given in 1996, so he might have changed his mind since then.
Moreover, your “trust your intuitions sometimes” is misleading: it is not simply trusting your intuitions, it is trusting them only in cases where there is data suggesting that intuition gives better results in similar cases. It has data behind it; the intuition is not taken for granted.
As Popper wrote, sensory data comes through organs that aren’t ‘perfect’ sensors, and our brain is not a ‘perfect’ thinker either. We know all that thanks to our knowledge of evolution, and that is Popper’s starting point. Popper didn’t have Kahneman’s or Pearl’s work, but he still encouraged critical examination of hypotheses while not treating intuitions as given (only as hypotheses, and only if they were falsifiable), and falsification is still the basis of science at this moment.
“the methods rely on a number of assumptions that you can’t test (like independence).”
You can test independence. There is a ton of frequentist literature on hypothesis testing, and Bayesian methods too, of course. Did you mean something else?
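For a concrete (if toy) example of the kind of test I mean, here is a short Python sketch using scipy’s chi-square test of independence on an invented 2x2 table. Note that it can only probe independence among variables you actually observed, not assumptions involving variables you never measured.

    # Minimal frequentist independence check on an observed 2x2 table.
    # The counts are made up for illustration.
    from scipy.stats import chi2_contingency

    # Rows: exposure yes/no; columns: outcome yes/no.
    table = [[30, 70],
             [55, 45]]

    chi2, p_value, dof, expected = chi2_contingency(table)
    print(f"chi2 = {chi2:.2f}, p = {p_value:.4f}")
    # A small p-value is evidence against independence of the two observed variables.
    # What this cannot test is independence involving variables you never measured
    # (e.g. "no unobserved confounder"), which is the kind of assumption a causal
    # model has to take as given.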
I wasn’t very clear, and probably misleading. Although I’m not an expert, I “read” Pearl’s book a few years ago (Causality: Models, Reasoning, and Inference; it’s available as a PDF), and it really seemed to me that some independence relations were hard to test and were sometimes simply assumed, given the system. It’s also true that I haven’t gone back to read it more deeply now that I have a bit more knowledge, and I lack the time to do so.
If you have more insight about that, I would love to read it.