overconcerned with objective reality, and that we should instead focus on how perceptions are subjective and how we relate to one another.
I hope these people are kept far, far away from nuclear plants. And regular factories. And machinery. Actually, far away from any sharp objects would be best...
Some might hope that people who do not allow such a concern to be tempered by other concerns, perhaps of a social and moral/ethical nature, should be kept as far away as possible from any of these objects.
After all, even J. R. Oppenheimer discarded his scientific detachment upon witnessing the first nuclear explosion; instead, he uttered the famous line: “Now I am become Death, the destroyer of worlds.” (By contrast, a more “rational” person might simply rejoice that his complex calculations predicting that the Earth’s atmosphere would not be burned up in the explosion had been proven correct by experimentation!) And Einstein famously regretted his career as a physicist upon learning of these fateful possibilities, stating that if he had known earlier, he would have chosen to be a watchmaker.
And Einstein famously regretted his career as a physicist upon learning of these fateful possibilities, stating that if he had known earlier, he would have chosen to be a watchmaker.
This is a common misattribution:
http://en.wikiquote.org/wiki/Albert_Einstein#Misattributed
Scroll down to “If only I had known, I should have become a watch-maker.”
Some might hope that people who do not allow such a concern to be tempered by other concerns, perhaps of a social and moral/ethical nature, should be kept as far away as possible from any of these objects.
Concern for what’s real and what’s not should NOT be “tempered by other concerns”. I think you’re confused between descriptive and normative, aka between what is and what should be.
Besides, while you may turn away from learning, say, what happens when you get a certain amount of U-235 packed together, other people won’t. And if at some point later they decide to come and take what used to be yours, well...
I think you’re confused between descriptive and normative, aka between what is and what should be.
These notions are intertwined, rather. “Normative” concerns guide the “descriptive” inquiries we choose to undertake, and provide criteria for what counts as a “successful” inquiry or experiment. Hume held that reason ought to be the slave of the passions; by contrast, medieval philosophers viewed “rational” inquiry as a slave to theology, with its cosmology (in the anthropological sense, i.e., what is our “basic, foundational picture”, the way we talk about reality?) and morality.
When we forget about these things, we end up with billions being spent in incredibly complicated experiments on supposedly ‘foundational’ particle physics at the LHC—raising existential risks, such as the possibility of creating a black hole, or a ‘strangelet’. Meanwhile, we don’t see anything near the same concern about, say, the animals nearest to us in the Hominidae group, many of which are significantly endangered in the wild, despite the obvious potential of learning so much more about “what it means to be human” by keeping them around and studying them more closely. These are not trivial concerns, whatever the supposed primacy of the ‘descriptive’ might imply. To treat them as such is quite dangerous.
When we forget about these things, we end up with billions being spent in incredibly complicated experiments on supposedly ‘foundational’ particle physics at the LHC—raising existential risks, such as the possibility of creating a black hole, or a ‘strangelet’.
This is a common misconception; from “Safety of high-energy particle collision experiments” on Wikipedia:
Claims escalated as commissioning of the LHC drew closer, around 2008–2010. The claimed dangers included the production of stable micro black holes and the creation of hypothetical particles called strangelets,[1] and these questions were explored in the media, on the Internet and at times through the courts.
To address these concerns in the context of the LHC, CERN mandated a group of independent scientists to review these scenarios. In a report issued in 2003, they concluded that, like current particle experiments such as the Relativistic Heavy Ion Collider (RHIC), the LHC particle collisions pose no conceivable threat.[2] A second review of the evidence commissioned by CERN was released in 2008. The report, prepared by a group of physicists affiliated to CERN but not involved in the LHC experiments, reaffirmed the safety of the LHC collisions in light of further research conducted since the 2003 assessment.[3][4] It was reviewed and endorsed by a CERN committee of 20 external scientists and by the Executive Committee of the Division of Particles & Fields of the American Physical Society,[5][6] and was later published in the peer-reviewed Journal of Physics G by the UK Institute of Physics, which also endorsed its conclusions.[3][7]
The report ruled out any doomsday scenario at the LHC, noting that the physical conditions and collision events which exist in the LHC, RHIC and other experiments occur naturally and routinely in the universe without hazardous consequences,[3] including ultra-high-energy cosmic rays observed to impact Earth with energies far higher than those in any man-made collider.
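To put that cosmic-ray comparison into rough numbers, here is a minimal back-of-the-envelope sketch (my own illustration, not part of the Wikipedia excerpt; the figures for the ~3.2×10^20 eV “Oh-My-God” cosmic ray and the LHC’s ~14 TeV design collision energy are approximate):

    # Rough comparison (back-of-the-envelope sketch, not from the cited report):
    # the highest-energy cosmic ray ever recorded (the 1991 "Oh-My-God" particle,
    # ~3.2e20 eV) versus the LHC's design collision energy (~14 TeV).
    import math

    M_PROTON_EV = 0.938e9        # proton rest energy, ~0.938 GeV
    LHC_CM_EV = 14e12            # LHC design centre-of-mass energy, 14 TeV
    COSMIC_RAY_LAB_EV = 3.2e20   # "Oh-My-God" particle energy in Earth's frame

    # A cosmic ray strikes a nucleon that is essentially at rest, so the fair
    # comparison is the centre-of-mass energy of that fixed-target collision:
    # E_cm ~= sqrt(2 * E_lab * m_p c^2), valid when E_lab >> m_p c^2.
    cosmic_cm_ev = math.sqrt(2 * COSMIC_RAY_LAB_EV * M_PROTON_EV)

    print(f"cosmic-ray collision, centre-of-mass: ~{cosmic_cm_ev / 1e12:.0f} TeV")
    print(f"LHC collision, centre-of-mass:        ~{LHC_CM_EV / 1e12:.0f} TeV")
    print(f"ratio: ~{cosmic_cm_ev / LHC_CM_EV:.0f}x")

Even on this fairer fixed-target comparison, nature has been running collisions dozens of times more energetic than the LHC’s in the upper atmosphere for as long as the Earth has existed.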
“Normative” concerns guide the “descriptive” inquiries we choose to undertake, and provide criteria for what counts as a “successful” inquiry or experiment.
Normative concerns guide which inquiries we choose to undertake, but they do not (or should not) affect the outcome of these inquiries.
Notably, normative concerns do NOT provide criteria for success. The cases where this has been attempted—e.g. Lysenko and genetics in Soviet Russia—are universally recognized as failures. Richard Feynman had a lot to say about this.
These are not trivial concerns
By which criteria do you divide concerns into “trivial” and not?
Normative concerns guide which inquiries we choose to undertake, but they do not (or should not) affect the outcome of these inquiries.
But they also guide what counts as success. If your biology research is aimed at developing new bioweapons, then stumbling upon a cure for cancer does not count as a success.