The impulses of Charles and Alex are not qualitatively different from our own, but this is a case of a quantitative difference big enough to amount to a qualitative one!
Most of us have biologically determined impulses to do this or that, but also the (biologically determined) capacity to override these impulses in cases where it would be inappropriate to give in to them. For instance, we can hold a full bladder for some time if we have more important matters we should deal with, and we can give a fully reductionist account of the terms “can” and “should” in such a sentence.
In the cases of Charles and Alex, the underlying intuition is that we could “rewind the tape” a thousand or a million times to the initial conditions (that is, after they acquired the tumor but before they committed whatever act we regard as criminal) and, butterfly effect or not, they would still commit their crimes: their behavior is overwhelmingly determined by an identifiable causal factor, the tumor.
In more “ordinary” criminal cases, we imagine that there was a “struggle of conscience” in the persons concerned that could have gone either way. That is, if you “rewound the tape” to a point after the person acquired whatever dispositions are considered relevant but before they committed their crime, you would find that in some significant percentage of these hypothetical scenarios (or “possible worlds”) their “conscience” won out and they refrained from the crime, after taking into account facts known to them such as the severity of penalties should they be caught, the harm caused to others, and so on.
One of the factors influencing behavior in such cases, possibly enough to swing the decision, is the individual’s awareness of his society’s system of penalties and norms for calculating harms. It therefore makes sense to make this judicial system as clear, as legible, and as well known as possible, so that it becomes the deciding factor in a greater number of such “struggles of conscience”. But it’s unreasonable to have that expectation in the case of a Charles or an Alex.
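To make the “rewind the tape” intuition a bit more concrete, here is a minimal toy simulation; the probabilities are invented purely for illustration and are not meant as an empirical claim about either kind of case.

```python
import random

def rewind_the_tape(p_impulse, p_conscience_wins, trials=100_000):
    """Replay the same initial conditions many times.

    In each replay, an impulse toward the act fires with probability
    p_impulse; if it fires, "conscience" overrides it with probability
    p_conscience_wins. Returns the fraction of replays in which the act
    is actually committed. All probabilities are illustrative only.
    """
    committed = 0
    for _ in range(trials):
        if random.random() < p_impulse and random.random() >= p_conscience_wins:
            committed += 1
    return committed / trials

# A Charles/Alex-style case: the tumor makes the impulse overwhelming and
# the override mechanism ineffective, so nearly every replay ends in the act.
print(rewind_the_tape(p_impulse=0.999, p_conscience_wins=0.001))

# An "ordinary" case: the impulse is weaker, and awareness of penalties and
# harms raises the chance that conscience wins, so many replays do not.
print(rewind_the_tape(p_impulse=0.6, p_conscience_wins=0.7))
```

In the first case the output is essentially 1 no matter how the deliberation parameter is nudged; in the second, changes to p_conscience_wins (say, from clearer knowledge of penalties) visibly shift the fraction of replays in which the crime occurs, which is the sense in which the quantitative difference becomes a qualitative one.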
This seems very general: it can cover things like coercion, “stealing bread”, and acting out of moral principle. Elegant.
What is one to make of the metaphor “struggle of conscience” after a reductionist account of free will? I find most of the standard LW take on free will very persuasive, but it doesn’t seem to dissolve my subjective sense of being a brain that sometimes makes an ‘effort’ to choose one option over another.
When I decided what college to go to, there was no struggle of conscience—I was simply curious about which option would be more fun and more productive for me, and striving to sort through all of the relevant evidence to maximize my chances of correctly identifying that option.
On the other hand, when I decide, say, whether to hit the snooze button, there is a blatant struggle of conscience—I am not curious at all about whether it would be more fun and productive for me to wake up; I know to a virtual certainty that it would, and yet it still requires a prolonged and intense effort, as if against some internal enemy, to accomplish the task. What is all of this, in reductionist terms?