If you allow a field to develop its instrumental rationality for a while without moralists sticking their noses in, you get something awesome like Schelling, or PUA, or pretty butterflies. If you get stuck discussing morals, you get… nothing much.
You may be on to something here; this may be a very useful heuristic against which to check our moral intuitions.
On the other hand, one still has to be careful: you probably wouldn’t want to encourage people to refine the art of taking over a country as a genocidal dictator, for example.
Although it is interesting to study in theory: through the Art of War, the Laws of Power, history itself, or computer simulations. Just so long as it doesn't involve much real-world experimentation. :)
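For what it's worth, "computer simulations" here can be as modest as a toy agent-based model. A minimal sketch (entirely hypothetical; the growth and backlash parameters are invented for illustration) of asking "does ruthlessness pay?" without any real-world experimentation:

```python
# Toy model: agents with a fixed "ruthlessness" trait compete for influence.
# Ruthless play grabs influence faster but occasionally triggers a backlash.
# All numbers below are made-up parameters, not empirical claims.
import random

def run_trial(n_agents=5, rounds=200):
    ruthlessness = [random.random() for _ in range(n_agents)]
    influence = [1.0] * n_agents
    for _ in range(rounds):
        for i in range(n_agents):
            # Influence grows in proportion to ruthlessness...
            influence[i] *= 1.0 + 0.10 * ruthlessness[i]
            # ...but ruthless agents risk a backlash that halves it.
            if random.random() < 0.05 * ruthlessness[i]:
                influence[i] *= 0.5
    winner = max(range(n_agents), key=lambda i: influence[i])
    most_ruthless = max(range(n_agents), key=lambda i: ruthlessness[i])
    return winner == most_ruthless

if __name__ == "__main__":
    trials = 10_000
    wins = sum(run_trial() for _ in range(trials))
    print(f"Most ruthless agent dominated in {wins / trials:.1%} of trials")
```

Nothing about this tells you how real coups work, of course; the point is only that the question can be posed and poked at in silico, ethically.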
But this is the fundamental problem: you don't want to let the theory in any field get too far ahead of the real-world experimentation. If it does, it becomes harder for the people who eventually do good (and ethical) research to have their work integrated properly into the field's body of knowledge. And knowledge that is not grounded in research is likely to be false. So an important question in any field should be "is there some portion of this that can be studied ethically?" If we "develop its instrumental rationality for a while without moralists sticking their noses in", we run the risk of letting theories run wild without sufficient evidence [evo-psych, I'm looking at you] or of relying on unethically obtained (and therefore less trustworthy) evidence.
“Unethically obtained evidence is less trustworthy” is the wrongest thing I’ve heard in this whole discussion :-)
How so? When scientists perform studies, they can sometimes benefit (money, job, or simply reputation) by inventing data or otherwise skipping steps in their research. At other times, they can benefit by declining to publish a result at all. A scientist who is willing to violate certain ethical principles (lying, cheating, etc.) is surely more willing to act unethically in publishing (or declining to publish) their studies.
Possibly more willing. For the sake of furthering human knowledge, they might be willing to break moral standards that they wouldn't break for personal gain. It would still be evidence of untrustworthiness, though.