it’s not obvious why being under the control of the puppet master would feel any different.
This is essentially why I posed the question. Anyone who believes they do have free will, or is disturbed by the idea that they don’t, ought to be able to say what (at least they think) would feel different without it.
I posit that if such a person tries to describe how they think “lack of free will” would feel, either they won’t be able to do it, or what they describe will be something obviously different from human experience (thereby implicitly redefining “free will” as something non-controversial).
I think Occam’s razor is reason enough to disbelieve the puppet master scenario. I’d readily admit that my idea of free will might be something entirely non-controversial. And I don’t have any problem with the idea that some currently existing machines might already have free will according to my definition (while for others the puppet master scenario is essentially true).
Me too. Didn’t mean to imply that I disagreed with your analysis.