Interesting responses. Given how many of them miss the point, the criticism about a lack of examples is well taken. It is evidently impossible to communicate without using examples.
Now that’s a hypothesis with only a bit of evidence behind it and not much confidence, and the heuristic I’m getting at here suggests I really ought to collect a wider sample. Maybe write 10 big comments randomly assigned to be exampleful or not exampleful, and see what correlations actually come up (see the sketch below). Note that if I don’t do that, I have to reason about subtle distinctions, confounding effects, and many possibilities; if I do, there’s no room for such philosophy: the measurements make it clear with little interpretation required.
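If I actually ran that, the bookkeeping might look something like this minimal Python sketch. Everything in it is hypothetical: the comments, the scoring scheme, the numbers.

```python
import random
from statistics import mean

# Toy sketch of the proposed experiment; the scoring scheme and
# values are invented for illustration.

comments = list(range(10))       # ten big comments, by index
random.shuffle(comments)
exampleful = set(comments[:5])   # a random half get examples

# After posting, score each comment by how well the replies engaged
# with the intended point (0.0 to 1.0, judged by hand).
scores = {c: 0.0 for c in comments}  # fill in from real replies

def effect(scores):
    """Mean score difference: exampleful minus not."""
    a = mean(s for c, s in scores.items() if c in exampleful)
    b = mean(s for c, s in scores.items() if c not in exampleful)
    return a - b  # positive means the examples helped

print(effect(scores))
```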
And that forms our first example of what I’m trying to get at: when you form a hypothesis, it’s a good idea to immediately ask whether there is an experiment that could pin it down so thoroughly that you could treat it as simple fact, or else reveal it as wrong. This is just the virtue of empiricism, which I previously didn’t take seriously.
Maybe this is only useful to me because of my particular mental state before and after this idea, and because of the work I do. So, to be clear, here are some examples of the kind of thing I had in mind:
Suppose you are designing a zinc-air alkaline fuel cell (as I am) and you see a funny degradation in the voltage over extended time, and a probe voltage wandering around. Preliminary investigations reveal (or do they?) that it’s an ohmic (as opposed to electrochemical) effect in the current-transfer parts. The only really serious hypothesis is that there is some kind of contact corrosion due to leaking. Great, we know what it is; let’s rip it apart and rebuild it with some fix for that (solder). “No,” says the Crush Your Uncertainty heuristic, “kick it when it’s down; kill all other hypotheses, prove it beyond all suspicion.”
So you do; you take it apart and painstakingly measure the resistance between all points, and note the peculiar distribution of resistances characteristic of a corrosion issue in one particular spot. But then, oh look: the resistance depends on how hard you push on it (a corrosion issue), the pins are corroded, and the contact surface has caustic electrolyte on it. And while we’re at it, we notice that the corrosion doesn’t correlate with the leaks, and is basically everywhere conditional on any leak, because the nickel current-distribution mesh wicks the electrolyte all over the place if it gets anywhere. And you learn a handful of other things (for example, why there are leaks, which was incidentally revealed by the thorough analysis of the contact issue).
...And we rip it apart and rebuild it with some solder. The decision at hand didn’t change, but the information gained killed all uncertainty and enlightened us about other stuff.
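For concreteness, the measurement pass might be organized something like the sketch below. The point names, resistance values, and threshold are all hypothetical.

```python
# Toy sketch of the "kill all other hypotheses" measurement pass.
# Point names, values, and the threshold are invented for illustration.

resistance = {  # measured resistance (ohms) between current-transfer points
    ("busbar", "pin_1"): 0.02,
    ("busbar", "pin_2"): 1.80,   # suspiciously high
    ("pin_1", "mesh"):   0.03,
    ("pin_2", "mesh"):   2.10,   # also high: consistent with corrosion at pin_2
}

corroded = {"pin_2", "mesh"}     # noted on disassembly
leaking  = {"pin_2"}             # where electrolyte is actually escaping

def suspicious(threshold=0.5):
    """Joints whose resistance is far above the healthy baseline."""
    return [pair for pair, r in resistance.items() if r > threshold]

print(suspicious())
# Corrosion that doesn't sit at a leak is the tell that the mesh is
# wicking electrolyte around:
print("corroded but not leaking:", corroded - leaking)
```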
So then imagine that you need lots of zinc in a particular processed form to feed your fuel cell, and the guys working that angle are debugging their system, so there’s never enough. In conversation it comes out that you think 3 kg was delivered, and they think they delivered 7. (For various reasons you can’t measure it directly.) That’s a pretty serious discrepancy. On closer analysis with better measurements next time, you both estimate ~12 kg. OK, there’s actually no problem; we can move on to other things. “No,” says the Crush Your Uncertainty heuristic, “kick it when it’s down.” This time you don’t listen.
...And it comes back to bite you. It turns out it was the “closer” inspection that was wrong (or was it?). Now it’s late in the game, the CEO is looking for someone to blame, and there’s some major problem that you would have caught earlier if you’d listened to the CYU heuristic.
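One cheap way to start crushing that kind of uncertainty is to ask whether the two estimates were ever consistent given their error bars; a standard inverse-variance combination makes the check mechanical. A toy sketch, with invented masses and uncertainties:

```python
# Are two estimates of the same quantity even compatible?
# Masses and error bars below are invented for illustration.

def discrepancy(m1, s1, m2, s2):
    """Disagreement between two estimates, in combined standard errors."""
    return abs(m1 - m2) / (s1**2 + s2**2) ** 0.5

def combine(m1, s1, m2, s2):
    """Inverse-variance weighted mean of two estimates (kg)."""
    w1, w2 = 1 / s1**2, 1 / s2**2
    return (w1 * m1 + w2 * m2) / (w1 + w2), (w1 + w2) ** -0.5

# 3 kg vs 7 kg with, say, +/-1 kg uncertainty on each side:
print(discrepancy(3, 1, 7, 1))   # ~2.8 sigma: not explained by noise
print(combine(3, 1, 7, 1))       # (5.0, ~0.71): nowhere near 12 kg either
```

The later “~12 kg” agreement deserved the same check, rather than being taken as permission to stop looking.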
There are a bunch of other examples, mostly technical stuff from work. In everyday life I don’t encounter enough of these problems to illustrate the point properly.
Again, this isn’t really about Bayes vs. frequentism; it’s about little evidence and lots of analysis vs. lots of evidence and little analysis. Basically, data beats algorithms, and you should take that seriously in any kind of investigation.
Yeah. I work as a programmer, and it took me a while to learn that even if you’re smart, double-checking is so much better than guessing, in so many unexpected ways. Another lesson in the same vein is “write down everything”.