I think a person who has trained awareness of their own cortisol levels is likely to have some useful knowledge about cortisol.
They might have hundreds of experiences where they did X and then noticed their cortisol rising. If you talk with them about stress, they might have their own ontology that classifies activities as stressful or not-stressful based on whether or not those activities raise their own cortisol level. I do think that such an ontology provides fruitful knowledge.
A decade ago plenty of psychologists ran around claiming that willpower is about how much glucose one has in one’s blood. Professor Roy Baumeister wrote his book Willpower around that thesis. If Baumeister had worn a device that gave him 24/7 information about his glucose levels, I think he would have gained knowledge that would have told him the thesis is wrong.
Yes, I agree that there could be genuine knowledge to be had in such a case. But it seems to me that what it takes to make it genuine knowledge is exactly what the OP here is lamenting the demand for.
Suppose you practice some sort of (let’s say) meditation, and after a while you become inwardly convinced that you are now aware at all times of the level of cortisol in your blood. You now try doing a bunch of things and see which ones lead to a “higher-cortisol experience”. Do you have knowledge about what activities raise and lower cortisol levels yet? I say: no, because as yet you don’t actually know that the thing you think is cortisol-awareness really is cortisol-awareness.
So now you test it. You hook up some sort of equipment that samples your blood and measures cortisol, and you do various things and record your estimates of your cortisol levels, and afterwards you compare them against what the machinery says. And lo, it turns out that you really have developed reliably accurate cortisol-awareness. Now do you have knowledge about what activities raise and lower cortisol levels? Yes, I think you do (with some caveats about just how thoroughly you’ve tested your cortisol-sense; it might turn out that it’s usually good but systematically wrong in some way you didn’t test).
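To make the caveat about “how thoroughly you’ve tested” concrete, here’s a minimal sketch (in Python, with entirely hypothetical numbers) of what such a test amounts to: pair each self-estimate with the corresponding lab measurement and check how well they agree, ideally across a wide range of activities so that systematic errors have a chance to show up.

```python
import numpy as np
from scipy import stats

# Hypothetical paired data: self-reported cortisol estimates vs. lab
# measurements (say, in nmol/L), collected across many activities.
estimated = np.array([12.0, 18.5, 9.0, 22.0, 15.5, 30.0, 11.0, 25.5])
measured  = np.array([11.5, 19.0, 10.0, 21.0, 16.0, 28.5, 12.5, 24.0])

# Correlation tells you whether the "cortisol sense" tracks reality at all.
r, p = stats.pearsonr(estimated, measured)

# Mean absolute error tells you how far off the estimates tend to be.
mae = np.mean(np.abs(estimated - measured))

print(f"Pearson r = {r:.2f} (p = {p:.3f}), MAE = {mae:.1f} nmol/L")
```

Even a high correlation here only rules out the failure modes the test covered; an estimate that tracks reality well during exercise might still be systematically off during, say, sleep deprivation, if that condition was never sampled.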
But this scientific evidence that your cortisol-sense really is a cortisol-sense is just what it takes to make appeals to that cortisol-sense no longer seem excessively subjective and unreliable and woo-y to hard-nosed rationalist types.
The specific examples jessicata gives in the OP seem to me to be ones where there isn’t, as yet, that sort of rigorous systematic modernism-friendly science-style evidence that intuition reliably matches reality.
Any way you do science can turn out to be usually good but systematically wrong in some way you didn’t test. Most placebo-blinded studies are built on questionable assumptions about how blinding works.
According to their profile, jessicata works in “decision theory, social epistemology, strategy, naturalized agency, mathematical foundations, decentralized networking systems and applications, theory of mind, and functional programming languages”.
In a field like theory of mind, there’s no knowledge that’s verified to standards that would satisfy a physicist. The knowledge you can have is less certain. Compared to the other knowledge sources available, improving your ability to introspect is a real help in building knowledge about the field.
The whole apparatus of science is about reducing the opportunities for being systematically wrong in ways you didn’t test. Sure, it doesn’t always work, but if there’s a better way I don’t think the human race has found it yet.
If knowledge is much harder to come by in domain A than in domain B, you can either accept that you don’t get to claim to know things as often in domain A, or else relax what you mean by “knowledge” when working in domain A. The latter feels better, because knowing things is nice, but I think the former is usually a better strategy. Otherwise there’s too much temptation to start treating things you “know” only in the sense of (say) most people in the field having strong shared intuitions about them in the same way as you treat things you “know” in the sense of having solid experimental evidence despite repeated attempts at refutation.