This may indeed be a failure mode that new people on teams are prone to, and maybe even something that new people on teams are especially prone to if they’ve read HPMOR, but I don’t think it’s the same as the thing I’m talking about–and in particular this doesn’t sound like me, as a new nurse who’s read HPMOR. I think the analog in nursing would be the new grad who’s carrying journal articles around everywhere, overconfident in their fresh-out-of-school knowledge, citing the new Best Practice Guidelines and nagging all the experienced nurses about not following them. Whereas I’m pretty much always underconfident, trying to watch how the experienced nurses do things and learn from them, asking for help a lot, and offering my help to everyone all the time. Which is probably annoying sometimes, but not in the same way.
I think that there is a spirit of heroic responsibility that makes people genuinely stronger, which Eliezer is doing his best to describe in HPMOR, and what you described is very much not in the spirit of heroic responsibility.
That’s a bit of a self-contradictory statement, isn’t it? (People can be unassertive but internally very overconfident, by the way.)
So you have that patient, you have your idea of the procedures that should have been done, and there’s the doctor’s idea, and in retrospect you think you were underconfident that your treatment plan was superior? What if, magically, you were in the position where you’d actually have to take charge? Where ordering a wrong procedure hurts the patient? It’s my understanding that there’s a very strong initial bias toward ordering unnecessary procedures, one that takes years of experience to overcome.
I suspect it’s one of those things that look very different from the inside and from the outside… None of those arrogant newbies would have seen themselves in my description (up until they wise up). Also, your prototype here is the heroic responsibility for saving the human race, taken up by someone who neither completed formal education in the relevant subjects, nor (which would actually be better to see) produced actual working software products of relevance, nor did other things of that nature which can be evaluated as correct in a way that’s somewhat immune to rationalization. And the straightforwardly responsible thing to do is to practice by doing more of those rationalization-immune things, because the idea is that screwing up here has very bad consequences.
The other issue is that you are, essentially, thinking meat, and if the activation of the neurons used for responsibility is outside a specific range, things don’t work right: performance is impaired, responsibility itself is impaired, and so on, whether the activation is too low or too high.
edit: to summarize with an analogy, say, driving a car without having passed a driving test is irresponsible, right? No matter how much you feel that you can drive the bus better than the person who’s legally driving it, the responsible thing to do is to pass a driving test first. Now, the heroes, they don’t need no stinking tests. They jump into the airplane cockpit and they land it just fine, without once asking if there’s a certified pilot on board. In most fiction, heroes are incredibly irresponsible, and the way they take responsibility for things is very irresponsible, but it all works out fine because it’s fiction.
So you have that patient, and you have your idea on the procedures that should have been done, and there’s doctor’s, and you in retrospect think you were under-confident that your treatment plan was superior?
I’m not sure that the doctor and I disagreed on that much. So we had this patient, who weighed 600 pounds and had all the chronic diseases that come with it, and he was having more and more trouble breathing–he was in heart failure, with water backing up into his lungs, basically. Which we were treating with diuretics, but he was already slowly going into kidney failure, and giving someone big doses of diuretics can push them into complete kidney failure, and also can make you deaf–so the doses we were giving him weren’t doing anything, and we couldn’t give him more. Normally it would have been an easy decision to intubate him and put him on a ventilator around Day 3, but at 600 pounds, with all that medical history, if we did that he’d end up in the hospital for six months, with a tracheotomy, all that. So the doctor had a good reason for wanting to delay the inevitable as long as possible. We were also both expecting that he would need dialysis sooner or later...but we couldn’t put him on dialysis to take water off his lungs and avoid having to intubate him, because he was completely confused and delirious and I had enough trouble getting him to keep his oxygen mask on. Dialysis really requires a patient who stays still. We couldn’t give him too many medications to calm him down, because anything with a sedative effect would decrease his respiratory effort, and then he’d end up needing to be intubated.
Basically, it was a problem with so many constraints that there was no good solution. I think that my disagreement with the doctor was over values–specifically, the doctor thought of the scenario where we intubate him and put him on dialysis on Monday as basically equivalent to the scenario where we delay it as long as possible and then end up intubating him on Thursday. Whereas to me, the latter, where my patient got to spend four extra days writhing around, confused and in pain and struggling to breathe, was a lot worse. I think nurses are trained to have more empathy and care more about a patient being in pain, and also I was seeing him for twelve hours a day whereas the doctor was seeing him for five minutes. And I was really hoping that there was a course of action no one had thought of that was better...but there wasn’t, at least not one I was able to think of. So the guy suffered for five days, ended up intubated, and is probably still in the hospital.
What if magically you were in the position where you’d actually have to take charge? Where ordering a wrong procedure hurts the patient?
I would be terrified all the time of doing the wrong thing. Maybe even more than I already am. I think as a nurse, I basically have causal power a lot of the time anyway–I point a problem out to the doctor, I suggest “do you want to do X”, he says, “Yeah, X is a good idea.” That’s scary, despite the presence of a back-up filter that will let me know if X is a terrible idea. [And doctors also have a lot of back-up filters: the pharmacy will call them to clarify a medication order that they think is a bad idea, and nurses can and will speak their opinion, and have the right to refuse to administer treatment if they think that it’s unsafe for the patient.]
Well, from your description it may be that the doctor has less hyperbolic discounting (due to having worked longer), being more able to weigh the chance of avoiding intrusive procedures and long-term hospitalization, which carry huge risks as well as a huge amount of total pain over time.
To say that you’re underconfident is to say that you’re correct more often than you believe yourself to be. The claim of underconfidence is not a claim underconfident people tend to make: underconfident people usually don’t muster enough confidence about their tendency to be right to conclude that they’re underconfident.
It’s self-contradictory only in the same way as “I believe a lot of false things” is. (Maybe a closer analogy: “I make a lot of mistakes.”) In other words, it makes a general claim that conflicts with various (unspecified) particular beliefs one has from time to time.
I am generally underconfident. That is: if I look at how sure I am about things (measured by how I feel, what I say, and in some cases how willing I am to take risks based on those opinions), with hindsight it turns out that my confidence is generally too low. In some sense, recognizing this should automatically increase my confidence levels until they stop being too low—but in practice my brain doesn’t work that way. (I repeat: in some sense it should, and that’s the only sense in which saying “I am generally underconfident” is self-contradictory.)
I make a lot of mistakes. That is: if I look at the various things I have from time to time believed to be true, with hindsight it turns out that quite often those beliefs are incorrect. It seems likely that I have a bunch of incorrect current beliefs, but of course I don’t know which ones they are.
(Perhaps I’ve introduced a new inconsistency by saying both “I am generally underconfident” and “I make a lot of mistakes”. As it happens, on the whole I think I haven’t; in any case that’s a red herring.)
Yes, that’s why I said it was a bit self-contradictory. The point is, you’ve got to have two confidence levels involved that aren’t consistent with each other, one being lower than the other.
No, that is an entirely coherent claim for a person to make and not even a particularly implausible one.