I think it’s pretty clear that scientific conclusions can be dangerous in the sense that telling everybody about them is dangerous. For example, the possibility of nuclear weapons. On the other hand, there should probably be an ethical injunction against deciding what kind of science other people get to do. (But in return maybe scientists themselves should think more carefully about whether what they’re doing is going to kill the human race or not.)
That’s the thing: the science wasn’t good or bad; it was the decision to give the results to certain people that was good or bad. And it was very, very bad. But the process of looking at the world, wondering how it works, then figuring out how it works, and then making it work the way you desire, that process carries with it no intrinsic moral qualities.
I don’t know what you mean by “intrinsic” moral qualities (is this to be contrasted with “extrinsic” moral qualities, and should I care less about the latter or what?). What I’m saying is just that the decision to pursue some scientific research has bad consequences (whether or not you intend to publicize it: doing it increases the probability that it will get publicized one way or another).
The majority of scientific discoveries (I’m tempted to say all, but I’m 90% certain that there exists at least one counterexample) have very good consequences as well as bad. I think the good and the bad usually go hand in hand.
To take the obvious example, nuclear research led to the creation of both nuclear weapons and nuclear energy.
At what point could you label research into a scientific field as having too many negative consequences to pursue?
I agree that this is a hard question.
General complaint: sometimes when I say that people should be doing a certain thing, someone responds that doing that thing requires answering hard questions. I don’t know what bringing this point up is supposed to accomplish. Yes, many things worth doing require answering hard questions. That is not a compelling reason not to do them.
I did not ask it because I wanted to stop the discussion by posing a hard question. I asked it because I aspire to do research in physics and will someday need an answer to it, so I have been very curious about the different arguments on this question. By asking it I by no means meant to claim that there are things that should not be researched; I was simply asking how one would go about identifying them.
Remove any confusions you might have about metaethics, figure out what it is you value, estimate what kind of impact the research you want to do will have with respect to what you value, estimate what kind of impact the other things you could do will have with respect to what you value, and pick the option that is more valuable.
Trying to retroactively judge previous research this way is difficult because the relevant quantity you want to estimate is not the observed net value of a given piece of research (which is hard enough to estimate) but the expected net value at the time the decision was being made to do the research. I think the expected value of research into nuclear physics in the past was highly negative because of how much it increased the probability of nuclear war, but I’m not a domain expert and can’t give hard numbers to back up this assertion.
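To make the comparison concrete, here is a minimal sketch of the expected-value calculation being described above; the notation is purely illustrative and not from the original comments:

$$
\mathbb{E}[V(a)] = \sum_i p(o_i \mid a)\, V(o_i), \qquad a^{*} = \arg\max_{a \in A} \mathbb{E}[V(a)]
$$

where $A$ is the set of actions available to you (do this research, do other research, do something else entirely), the $o_i$ range over possible outcomes, $p(o_i \mid a)$ is your probability estimate of outcome $o_i$ given action $a$, and $V(o_i)$ is how much you value that outcome. The hard part is not the formula but producing honest estimates of $p$ and $V$; and, per the caveat above about retroactive judgement, those estimates have to be the ones available at decision time, not the ones you can compute with hindsight.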
I’m reading through all of the sequences (slowly; it takes a while to truly understand them, and I started in 2012), and by coincidence I happen to be at the beginning of metaethics at the moment. Until I finish I won’t argue any further on this subject, since I’m still confused. Thanks for the help.
I think nuclear weapons have a chance of killing a large number of people but are very unlikely to kill the human race.
At one point, physicists thought detonating even one nuclear bomb might set fire to the atmosphere.
This was taken seriously, and was disproven before one was in fact detonated, but it’s not clear that the tests wouldn’t have gone ahead even if the verdict had come back as merely “unlikely”.
In the current day, biologists, computer scientists, and physicists are all working on devices that could be far more dangerous than nuclear weapons. In this case the danger is well known, but no one high-status enough to succeed is seriously proposing a moratorium on the research. To be fair, we’ve still got some time to go.