I don’t think I ever had this confused concept of free will. That is, the idea that the future of my actions is undetermined until I make a decision, or that my actions are governed by anything other than normal physics, never made any sense to me at all.
To me, possessing free will means being capable, in principle, of being the causal bottleneck of my decisions other than through pure chance.
Making a decision means caching the result of a mental calculation about whether to take a certain course of action (which in humans has the strong psychological consequence of affirming that result).
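To make that concrete, here is a toy Python sketch, entirely my own illustration: the function name, its parameters, and the threshold rule are all made-up assumptions. The “mental calculation” runs once, and the cached result is what gets returned from then on.

```python
# Toy sketch (my own illustration, nothing rigorous): a decision as the
# cached result of a calculation about a course of action.
from functools import lru_cache

@lru_cache(maxsize=None)
def decide(action: str, expected_benefit: float, expected_cost: float) -> bool:
    """Run the 'mental calculation' once and cache (commit to) the result."""
    return expected_benefit > expected_cost

# Once made, the same decision is returned without re-deliberating,
# loosely mirroring the psychological affirming of the result.
print(decide("take the job", expected_benefit=3.0, expected_cost=1.0))  # True
print(decide("take the job", expected_benefit=3.0, expected_cost=1.0))  # cached, same answer
```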
Being the causal bottleneck is much more difficult to define than I thought when I started this post, but it involves comparing what sorts of changes to me would result in a different decision with what sorts of changes to the rest of the world would result in the same one.
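Here is an equally toy sketch of that comparison (again my own invention; the bit-vector states and the decision rule are arbitrary assumptions): perturb my state and the world’s state separately and count which perturbations flip the decision. Being the bottleneck would then roughly mean that the decision is more sensitive to changes in me than to changes in the rest of the world.

```python
# Toy sketch of the "causal bottleneck" comparison. The states and the
# decision rule below are arbitrary assumptions made up for illustration.
def decision(agent_bits, world_bits):
    # Hypothetical rule in which the agent's state carries more weight.
    return 2 * sum(agent_bits) + sum(world_bits) > 4

def flips(bits, other, agent_side):
    """Count single-bit perturbations of `bits` that change the decision."""
    base = decision(bits, other) if agent_side else decision(other, bits)
    count = 0
    for i in range(len(bits)):
        flipped = bits[:i] + (1 - bits[i],) + bits[i + 1:]
        new = decision(flipped, other) if agent_side else decision(other, flipped)
        count += new != base
    return count

agent, world = (1, 1, 0), (0, 1, 0)
print("changes to me that flip the decision:       ", flips(agent, world, True))   # 2
print("changes to the world that flip the decision:", flips(world, agent, False))  # 1
```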
The only ways I could see not having free will would be either not being able to make decisions at all, or not being able to make decisions except under the influence of something else that is itself the causal bottleneck of the decision and is not part of me. I can’t see how the second could be the case without some sort of puppet master (and there would have to be some reason against concluding that this puppet master is the real me), but it’s not obvious why being under the control of the puppet master would feel any different.
“it’s not obvious why being under the control of the puppet master would feel any different.”
This is essentially why I posed the question. Anyone who believes they do have free will, or who is disturbed by the idea that they don’t, ought to be able to say what (at least they think) would feel different without it.
I posit that if such a person tries to describe how they think “lack of free will” would feel, either they won’t be able to do it, or what they describe will be something obviously different from human experience (thereby implicitly redefining “free will” as something non-controversial).
I think Occam’s razor is reason enough to disbelieve the puppet master scenario. I’d readily admit that my idea of free will might be something entirely non-controversial. And I don’t have any problem with the idea that some currently existing machines might already have free will according to my definition (while for others the puppet master scenario is essentially true).
Me too. Didn’t mean to imply that I disagreed with your analysis.