Ah, right. The good ol’ “the only consistent meaning of ‘free will’ is ‘what humans do’” approach.
However, I think that it IS possible to imagine how it matters if PeopleHaveFreeWill=false (though it’s quite difficult to visualize it from inside—I can only imagine “toning down” the free will by eliminating certain desiderata). Imagine that Laplace’s demon could exist, and it wrote down the story of your life in a book when you were born. Someone else could read the book and know exactly what you do next year. My intuition doesn’t think this sounds like free will.
Or imagine a universe where all your decisions were completely random. That doesn’t sound like free will either, right? But all your (note: my definition of “your,” i.e. “the measured you”) decisions are random, to the extent that a muon could come screaming out of the atmosphere and make your brain misfire at any time.
So if free will is really poorly defined (and it is), then the simple definition that makes sense is "what humans do"; importantly, this definition agrees with our intuition that we have free will. However, if our intuition is allowed to speculate a bit more, we can think up scenarios where we might not have free will. But this contradicts the intuition from two sentences ago that we definitely have free will! What I am trying to demonstrate is that there is a problem after all, and it is in the murky way in which our intuition handles the question "does X have free will?" If the problem is really dealt with, we should end up understanding how our intuition works here, at least to a large degree. That's why I think Yvain's post is a good model.
New idea: Laplace’s demon slasher movie: I know what you did next summer!
So, you suddenly realise you live in either of those universes and go “oh, well, I have no free will”.
Does that imply anything for you? Do you start behaving any differently? Is there any practical conclusion that you would reach in both of those universes that you wouldn't reach in one where you had free will (which shouldn't exist, since you ruled out both determinism and non-determinism, but we'll allow it, since the lack of a counterfactual would also make free will meaningless)? Emphasis on "both": there are interesting consequences to determinism and to non-determinism, but you need free will to be the discriminating factor for the concept to be worth existing.
(As a side note, my “intuitive answers” aren’t the same as yours, but I won’t bring them up since I’m arguing that everyone’s “intuitive answers” are just non-answers to a non-question.)
Well, it would certainly shake up my morality a bit, which would then change my actions. My ideas of punishment and reward would become more utilitarian as I held people less “responsible” for doing good or bad, for example.
However, if you're asking "what would be different if you'd been living in that universe all along and never found out," I must admit I can't think of anything. Wait, never mind. "The Bell inequalities wouldn't be violated." Or "fermions wouldn't be identical particles." "Arithmetic would be inconsistent." But it's possible to imagine "just so" theories that would fit observations without having much free will. I wouldn't say a Boltzmann brain has free will in the second before it boils away into the plasma.
Still, I think Occam’s razor helps rule that stuff out. I’ll have to think about it more.