Do we define a moral dilemma as something where you are not punished for making the wrong choice? Because if you are, it is more of a calculation for your own profit.
In my personal life I encounter almost none, since there would almost always be some kind of punishment, at the very least people thinking I am an asshole and being less willing to help me in the future, and this makes them not purely moral dilemmas.
I have a hunch that moral dilemmas are “meant” to be more political. Like: should we allow factory farming of animals?
Also, they are for people with more interesting jobs, such as doctors.
I think the legal systems of the first world are so tightly tied down that a normal, mundane citizen rarely encounters purely moral dilemmas. Usually, if something is dubious, it is not allowed.
Therefore, moral dilemmas are handled at the law-making level, hence at voting. They are political.
For example, Climate Change / AGW is a huge moral dilemma for me. I tend to lean towards the skeptics being more right, because the alarmists have been talking about taking action at the last dramatic minute for 20 years now. The alarmists look a lot like the usual suspects of anti-industrialist hippies. But do I really dare to gamble on this politically? It would be safer to act as if the alarmists are right. My feelings about a bunch of kumbaya hippies are less important than not making the planet almost uninhabitable, and if there is even a 1% chance the whole alarmist case is right, despite its many problems, we should be working more on cutting CO2...
because the alarmists have been talking about taking action at the last dramatic minute for 20 years now.
I’m not understanding your argument here.
If the intended conclusion is “… so they’re probably wrong”, I just don’t get it at all. I mean, I don’t think anyone[1] ever claimed “we have to fix this now or we’ll all be boiled alive in 15 years”.
If the intended conclusion is “… so they’re probably insincere”, I kinda-sorta get it but it seems wrong. If you think you’ve discovered something that requires urgent action, and the people with the power to take that action keep on not doing it, of course you’re going to keep saying “look, we have to do this, it’s urgent”. No?
[1] Meaning serious climate scientists, of course. The Day After Tomorrow was not a documentary.
That seems like a nice No True Scotsman prologue :-) Do people like James Hansen qualify?
Hansen certainly does. Has he made predictions of the kind I said I didn’t know of anyone making? I skimmed through his Wikipedia page and didn’t find anything so extreme (though he’s said things that are extreme in other ways).
If the intended conclusion is “… so they’re probably wrong”, I just don’t get it at all. I mean, I don’t think anyone[1] ever claimed “we have to fix this now or we’ll all be boiled alive in 15 years”.
Actually, the models in the 1990s predicted rather dire consequences for 2015.
and the people with the power to take that action keep on not doing it, of course you’re going to keep saying “look, we have to do this, it’s urgent”. No?
At some point it is too late to avoid the disaster, and it is better to start preparing for it. That would be my point. Anyone who predicts “do X now or disaster will happen in 20 years”, and then X is not done, loses a lot of cred when they still advocate X. They should instead be saying “okay, the disaster is now unavoidable; better to start preparing for it.”
Actually, the models in the 1990s predicted rather dire consequences for 2015.
Interesting. Examples?
At some point it is too late to avoid the disaster, and it is better to start preparing for it.
Probably true, though probably the sequence actually goes: disaster avoidable → disaster unavoidable but severity can be mitigated → disaster unavoidable and unmitigable, time to prepare → too late for anything, we’re screwed. And I’d have thought that second phase might be quite prolonged.
Anyone who predicts “do X now or disaster will happen in 20 years”, and then X is not done, loses a lot of cred when they still advocate X.
Only if X is only worth doing if done immediately. What reason is there to think that’s the situation here?
Imagine the following super-crude model of climate change. In year 0, we discover that from year 50 onwards the temperature is going to rise by 0.2 degrees (Celsius) per year. There is a drastic action we can take to stop this; if we do this in year Y, the warming will stop in year Y+50. In year 100, regardless, the whole thing will magically stabilize at whatever temperature is reached then.
In this model, if we do nothing then from year 100 onwards the temperature is going to be 10 degrees hotter than now, which it’s fair to say will screw a lot of things up very badly. In fact, just doing nothing for 20 years guarantees 4 degrees of temperature rise, which is probably enough to be pretty catastrophic. So the alarmists say: “We must take action within 20 years or it’ll be a disaster!”.
OK, so now it’s 20 years on and no one has done anything yet. We have 4 degrees of temperature rise ahead of us, whatever we do. But the right thing to say isn’t “OK, disaster is unavoidable, let’s just prepare to cope with it” because the magnitude of the disaster is still open. If we take action now in year 20, we only have 4 degrees of temperature rise to cope with. If we give up on stopping the warming and switch to disaster preparation, we have to prepare for 10 degrees of temperature rise, which is much worse.
(And without the cutoff in year 100, if we give up and switch to disaster preparation then the disaster we have to prepare for is the near-certain extinction of the human race within a few centuries.)
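(To make the arithmetic concrete, here is a minimal Python sketch of this toy model. The function and every number in it are just the made-up figures from the paragraphs above, so it illustrates the toy model only, not real climate data.)

```python
# Minimal sketch of the toy model described above. Every number is the made-up
# one from the comment: warming of 0.2 degrees/year starting in year 50, a
# 50-year lag between acting and the warming stopping, stabilization in year 100.

def final_warming(action_year=None, onset=50, lag=50, rate=0.2, cutoff=100):
    """Total eventual temperature rise (degrees C) if the drastic action is
    taken in `action_year`; None means the action is never taken."""
    stop_year = cutoff if action_year is None else min(action_year + lag, cutoff)
    warming_years = max(0, stop_year - onset)
    return warming_years * rate

print(final_warming(0))     # act immediately in year 0:  0.0 degrees
print(final_warming(20))    # act only in year 20:        4.0 degrees
print(final_warming(None))  # never act:                 10.0 degrees
```

In this sketch, acting in year 20 is strictly worse than acting in year 0, but still far better than never acting at all, which is the point of the toy model.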
For the avoidance of doubt, I am not putting this forward as an accurate account of the actual climate change situation! But it seems to me to have a lot of features in common—possible disaster ahead, considerable lag between action and eventual consequences, taking action sooner means smaller effect. And in my toy model, it seems very clear that a sensible and sincere “alarmist” will both (1) say “disaster ahead if we don’t act really soon” and (2) continue saying that for a long time as no action continues to be taken. Which is exactly what you’re saying they shouldn’t be saying in the real world. What are the relevant differences that make your inference a good one in the real world and not in my toy example?