Epistemic status is not a measure of confidence but of reasons for confidence. “It’s written in a textbook” is an epistemic status for a claim, but it does not say how confident I am in the claim.
When people start their posts with “Epistemic status: …” they usually are not listing probabilities.
I tend to disagree, although I could be wrong. My epistemic status for the claim is at 65% ;). I have not spent as much time on LessWrong, so it would not surprise me if I am mistaken.
Before posting that comment I double-checked with this article. Here is a quote the author took from Urban Dictionary (the author said the definition is accurate, and in my experience it is):
The epistemic status is a short disclaimer at the top of a post that explains how confident the author is in the contents of the post, how much reputation the author is willing to stake on it, what sorts of tests the thesis has passed.
It should give a reader a sense of how seriously they should take the post.
I do agree that people do not typically list probabilities in their epistemic status, although I have seen several people do so. It is not unheard of, and I still think it is good practice for especially important claims.
You are not wrong that an epistemic status often includes the reasons for a claim, but it usually also includes the level of confidence in the claim.
Of course, it’s just a definition, so it’s not the end of the world if we disagree over it. It’s not a crux 🤷‍♂️.
The linked article has a bunch of examples in the section “A bunch of examples”. None of them have any probabilities attached. The post also does not list “your confidence” in the section “Valuable information to list in an epistemic status”.
Some of them do include language about confidence, but even a status like “Pretty confident. But also, enthusiasm on the verge of partisanship” (the first in the list) is more than just the author sharing their confidence.
The second is: “I have worked for 1 year in a junior role at a large consulting company. Most security experts have much more knowledge regarding the culture and what matters in information security. My experiences are based on a sample size of five projects, each with different clients. It is therefore quite plausible that consulting in information security is very different from what I experienced. Feedback from a handful of other consultants supports my views.”
This statement makes it transparent to the reader why the author believes what they believe. It is up to the reader to decide how confident to be in what the author is saying based on that statement. That is similar to how the information about partisanship communicated in the first example is about transparency, so that the reader can make a better decision instead of just trusting the author.
Let me expand the quote I gave before to add context:
“The epistemic status is a short disclaimer at the top of a post that explains how confident the author is in the contents of the post, how much reputation the author is willing to stake on it, what sorts of tests the thesis has passed.
It should give a reader a sense of how seriously they should take the post.
I think that’s a surprisingly good explanation. Commenters might be able to add to it, in which case I’ll add an elaboration.”
It is a “surprisingly good explanation”, but it does not include everything. The author did not mention attaching confidence again because the definition already covered it; they were merely adding points the definition did not cover (their “elaboration”, as they put it).
You are right that none of the examples attach probabilities to their claims, but that does not mean attaching probabilities is incompatible with a good epistemic status. As far as I know that is not a controversial position, but if I am wrong I will keep that in mind next time I explain it to others. My intuition is that explicitly stating the probability you attach to your beliefs is a good tenet of reasoning transparency, and it is also a practice of superforecasters. Stating the probability also helps maintain a culture of embracing uncertainty.
You’re right. An epistemic status should, and usually does, include information that helps the reader judge for themselves whether the author’s reasons are credible. But 10 of the 12 examples from that post mention the author’s level of confidence or uncertainty in the claims they make, including the two examples you just gave. Language like “quite plausible that consulting in information security is very different from what I experienced” is an explicit mention of the level of confidence.
I agree that we should include everything you mention, but I disagree that we should leave out confidence, or that it is unimportant. I want the author I am reading to be calibrated with the relevant probabilities: if they believe something is 25% likely, it should happen about 25% of the time. If it does not, something is wrong with their models of the world.
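The calibration idea above can be sketched in a few lines. This is a minimal illustration, not anything from the linked post: the `calibration_by_bucket` helper and the toy prediction data are hypothetical, invented for this example.

```python
# Each entry pairs a stated probability with whether the claim came true.
# Hypothetical toy data for illustration only.
predictions = [
    (0.25, False), (0.25, False), (0.25, True), (0.25, False),
    (0.75, True), (0.75, True), (0.75, False), (0.75, True),
]

def calibration_by_bucket(preds):
    """Group predictions by stated probability and report the observed
    frequency with which claims in each bucket came true."""
    buckets = {}
    for prob, outcome in preds:
        buckets.setdefault(prob, []).append(outcome)
    return {prob: sum(outcomes) / len(outcomes)
            for prob, outcomes in buckets.items()}

print(calibration_by_bucket(predictions))
# → {0.25: 0.25, 0.75: 0.75}
```

For a well-calibrated author, the observed frequency in each bucket should roughly match the stated probability, as it does in this toy data; a large gap would suggest something is off in their models.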
The act of sharing that information with the reader is an act of respecting the reader’s agency. In debates like https://www.lesswrong.com/posts/u9a8RFtsxXwKaxWAa/why-i-quit-effective-altruism-and-why-timothy-telleen-lawton, respecting the epistemic agency of other people is a big deal.
Again, there is nothing wrong with sharing that information. I agree with almost everything you write here.