In general, given ethical norms as they currently exist, rather than in a hypothetical universe where everyone is a strict utilitarian, I think the expected returns on such an experiment are unlikely to be worth the reputational costs.
The Tuskegee experiment may have produced some useful data, but it certainly didn’t produce returns on the scale of reducing global syphilis incidence to zero. Likewise, even extensive experimentation on abducted children is unlikely to do so for malaria. The Tuskegee experiment, though, is still seen as a black mark on the reputation of medical researchers and the government; I’ve encountered people who, having heard of it, genuinely believed that it described the behavior of present-day researchers more accurately than the extremely stringent standards that currently govern publishable studies. That sort of thing isn’t easy to escape.
Any effective utilitarian must account for the fact that we’re operating in a world which is extremely unforgiving of behavior such as cutting up a healthy hospital visitor to save several in need of organ transplants, and condition their behavior on that knowledge.
Here’s one with actual information gained: Imperial Japanese experimentation on frostbite.
Unit 731 proved that the best treatment for frostbite was not rubbing the limb, which had been the traditional method, but immersion in water a bit warmer than 100 degrees, and never more than 122 degrees.
The cost of this scientific breakthrough was borne by those seized for medical experiments. They were taken outside and left with exposed arms, periodically drenched with water, until a guard decided that frostbite had set in. Testimony from a Japanese officer said this was determined after the “frozen arms, when struck with a short stick, emitted a sound resembling that which a board gives when it is struck.”
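As an aside, those temperatures are presumably in degrees Fahrenheit (122 degrees Celsius would be well past boiling). A quick conversion, just as a sanity check on my part rather than anything from the original account, puts the recommended range at roughly body temperature up to 50 °C:

```python
# Convert the quoted Fahrenheit figures to Celsius: C = (F - 32) * 5/9.
# The assumption that the source meant Fahrenheit is mine, not the original's.
def fahrenheit_to_celsius(f: float) -> float:
    return (f - 32) * 5 / 9

print(round(fahrenheit_to_celsius(100), 1))  # 37.8, just above body temperature
print(round(fahrenheit_to_celsius(122), 1))  # 50.0, the stated upper bound
```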
I don’t get the impression that those experiments destroyed a lot of trust, at least nothing compared to the Rape of Nanking or Japanese treatment of American prisoners of war.
However, it might be worth noting that that sort of experimentation doesn’t seem to happen to people who are affiliated with the scientists or the government.
Logically, people could volunteer for such experiments and get the same respect that soldiers do, but I don’t know of any real-world examples.
It’s hard for experiments to destroy trust when those doing the experiments aren’t trusted anyway because they do other things that are as bad (and often on a larger scale).
I was going to say that I didn’t think medical researchers had ever solicited volunteers for experiments which are near certain to produce such traumatic effects, but on second thought, I do recall that some of the early research on decompression sickness (as experienced by divers) was done by a scientist who solicited volunteers to be subjected to it. I believe that some research on the effects of dramatic deceleration was also done similarly.
I have heard of someone who was trying to determine the biomechanics of crucifixion (what part of the forearm the nail goes through, whether suffocation is actually the main cause of death, and so on) who ran some initial tests with medical cadavers, and then with tied-up volunteers, some of whom were disappointed that they weren’t going to have actual nails driven through their wrists. Are extreme masochists under-represented on medical ethics boards?
Actual medical conspiracies, such as the Tuskegee syphilis experiment, probably contribute to public credence in medical conspiracy theories, such as anti-vax or HIV/AIDS denialism, which have a directly detrimental effect on public health.
Probably.
In a culture of ideal rationalists, you might be better off having a government-run lottery in which people were randomly selected for participation in medical experiments, with participation upon selection being mandatory for any experiment, whatever its effects on the participants, and with experiments being approved only if their expected returns were more valuable than any negative effects (including loss of time) imposed on the participants. But we’re a species which is instinctively more afraid of sharks than stairs, so for human beings this probably isn’t a good recipe for social harmony.
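To make that approval criterion concrete, here is a minimal sketch of the vetting rule. The class name, fields, and utility numbers are all hypothetical illustrations of mine, not part of any actual proposal:

```python
from dataclasses import dataclass

@dataclass
class ProposedExperiment:
    name: str
    expected_benefit: float      # expected value of the knowledge gained
    harm_per_participant: float  # all negative effects, including lost time
    num_participants: int

def should_approve(exp: ProposedExperiment) -> bool:
    # Approve only if the expected returns exceed the total negative
    # effect imposed on the randomly selected participants.
    total_harm = exp.harm_per_participant * exp.num_participants
    return exp.expected_benefit > total_harm

# Hypothetical numbers in arbitrary utility units:
trial = ProposedExperiment("frostbite treatment study", 1000.0, 40.0, 20)
print(should_approve(trial))  # True, since 1000 > 40 * 20 = 800
```

Of course, the hard part is estimating those numbers honestly in the first place, which is part of why the proposal only works among ideal rationalists.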
So would you be in favor of educating people why things like the Tuskegee experiment or human experimentation on abducted children are good things?
Not directly, because I don’t think it would be likely to work. I do think that people should be educated in practical applications of utilitarianism (for instance, the importance of efficiency in charity), but I don’t think that this would be likely to result in widespread approval of such practices.
In the specific case of the Tuskegee experiment, the methodology was not good, and given that treatments were already available, the expected return was not that great, so it’s not a very good example from which to generalize the potential value of studies which would be considered exploitative of the test subjects.