Eliezer uses a compatibilist definition, which works well for the central example (a mentally competent human in Western culture) but fails at the boundaries (unusual cultures, mental disorders, non-human animals, algorithms). Hence my original question.
He further elaborates
There—now my sensation of freedom indicates something coherent; and most of the time, I will have no reason to doubt the sensation’s veracity. I have no problems about saying that I have “free will” appropriately defined; so long as I am out of jail, uncertain of my own future decision, and living in a lawful universe that gave me emotions and morals whose interaction determines my choices.
Yet, in the next paragraph he states
Certainly I do not “lack free will” if that means I am in jail, or never uncertain of my future decisions, or in a brain-state where my emotions and morals fail to determine my actions in the usual way.
which seems to me to contradict the one before, as it expands the definition to include every possible human mind-state.
I do not recall him giving an example of a mind state which is clearly marked as “no free will”.
I don’t know what Eliezer’s sentence is doing there, to the point that I suspect it has been garbled by an editing error. But I don’t see why “having free will” should not cover pretty much all mind-states, short of being asleep or subject to abnormalities such as drug addiction. The phenomenon he is pointing to, whatever its name, is something that human minds do.
I’m very unclear on your question, and where you think the contradiction lies. Being addicted to a drug that you will reliably seek despite considering it wrong would reduce your “free will,” as it would take you closer to being “never uncertain of my future decisions, or in a brain-state where my emotions and morals fail to determine my actions in the usual way.”
(I would personally not have included the “uncertain” part before encountering Eliezer’s work, but of course other writers do treat it as important.)
Tried to clarify my question again.