As I understand it, medical school is very heavy on memorization—doctors are expected to absorb a huge quantity of data about procedures, medications, etc. So much memorization, in fact, that it’s probably incompatible with a habit of model-checking everything. If that’s true, then this would filter out the traits that medicine needs most to repair its practices.
I’m pretty sure it doesn’t take 4 years of “whatever you feel like majoring in” in university before even starting med school, just to learn an angiogram classifier that a computer can probably outperform you on anyway.
At least, the English lit classes could probably be scrapped …
When it comes to medicine, if computers CAN outperform the average person, we should probably be using computers anyway. (Yeah, that maxim applies to most professions, but most professions don’t have lives on the line).
Hey, if you want to get doctors to step out of the way on the grounds that a computer really can trounce their expert judgment, even in just a few domains … well, you’re going up against a lot of resistance.
Well, yeah. That will definitely be an issue, if not now then in a few years. But I also wouldn’t be surprised if this were a genuinely difficult task, and I don’t know that statements like “a computer can probably outperform you” are justifiable to throw around unless you actually know that computers DO have a better track record at the task in question.
I haven’t read it myself, but I remember that there are a lot of stories like these in Ian Ayres’s Super Crunchers about the success of simple algorithms over expert judgment and the resistance thereto. And Superfreakonomics mentions the story of how hard it was to get doctors to wash their hands as often as necessary.
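For what it’s worth, the “simple algorithms” in that literature are usually nothing exotic — often just an additive scoring rule compared against a cutoff. A toy sketch (the risk factors, weights, and cutoff here are invented for illustration, not taken from any real clinical rule):

```python
def risk_score(patient):
    """Sum a few weighted yes/no risk factors (hypothetical weights)."""
    weights = {"age_over_60": 2, "chest_pain": 3, "abnormal_ecg": 4}
    return sum(w for factor, w in weights.items() if patient.get(factor))

def flag_for_followup(patient, cutoff=5):
    """Flag the patient whenever the additive score meets the cutoff."""
    return risk_score(patient) >= cutoff

# Two made-up patients: one flagged, one not.
print(flag_for_followup({"age_over_60": True, "abnormal_ecg": True}))  # score 6 -> True
print(flag_for_followup({"chest_pain": True}))                         # score 3 -> False
```

The point of those stories is that rules this crude, applied consistently, have sometimes matched or beaten unaided expert judgment — not that the math is sophisticated.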
Double-scary when you think about how much education one must (legally) get before becoming a cardiologist.
There’s a huge amount of waste somewhere in all that.
Or the field really is that confusing, which wouldn’t surprise me too much. You’re dealing with variables that are constantly changing.