There are certainly different schools of thought possible on how much time to invest in disease identification before going for a treatment, but can you explain your evidence for why you think most doctors tend to err on the side of over-caution?
Medicine does include the ideas of “empirical treatment” and “empirical diagnosis”. Empirical treatment is when, eg, a doctor can’t figure out exactly what a disease is, but it looks bacterial, so ey’ll throw some common antibiotics at it and see if they work. Empirical diagnosis is when a doctor isn’t sure about a particular diagnosis, so ey gives the treatment for that diagnosis, and then, if the patient gets a bit better, ey’s convinced and moves on to more serious long-term treatment.
These are useful but still not as good as knowing what you’re doing. First of all, a lot of the time they complicate the picture enough to make real diagnosis impossible: an organism that could have been cultured and identified before starting empirical antibiotics might be decimated enough to become unidentifiable afterwards, but not so dead that it can’t bounce back and cause a relapse. So starting empirical treatment can be dangerous if you think you’re ever going to change your mind and want to figure out exactly what’s wrong.
And a lot of the time, the treatment for one disease will make another similar disease worse. Steroids are the best treatment for a lot of autoimmune conditions, but they will make infectious conditions worse, and immune conditions can look like infectious conditions if you haven’t investigated them properly. This is especially true of rashes, which some doctors prescribe steroids for almost automatically. With rashes it’s not the end of the world, because even if a rash gets worse you probably won’t die from it, but it still annoys patients, and there are some other conditions where you don’t have that margin of error.
There’s also just the common-sense problem of people getting really angry if they’ve been told they have a disease, they’ve changed their life around to live with that disease, they’ve been taking medicine for that disease which may have all sorts of side effects, and then they go to a second doctor who tells them they don’t in fact have the disease. I saw a lady the other day who for eleven years thought she had multiple sclerosis, switched neurologists, and then the new neurologist told her she didn’t have it. That’s an extreme and very rare example, and I have a feeling the error is with the new guy rather than the old guy, but even if so, that level of uncertainty and confusion is still traumatic and is a good reason for doctors to work really hard to get a firm diagnosis before giving it.
Maybe I’m missing your point; if so can you give an example of what you mean?