Zombie-me’s are the replicas of me in alternate worlds. They aren’t under my conscious control, thus they’re “zombies” from my perspective.
Except, in my understanding, they are created every time I make a choice, in proportion to the probability that I would choose X over Y. That is, if there's a 91% chance that I'd choose X, then in 91% of the worlds the zombie-me's have chosen X, and in the remaining 9% they've chosen Y.
Again, caveat: I don’t think physics and probability were meant to be interpreted this way.
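To make the arithmetic concrete, here's a rough sketch of the branch-proportion picture I have in mind (Python, purely illustrative; the world count and the 91/9 split are just the made-up numbers from my example above):

```python
# Toy illustration only: counting how many "worlds" end up with each choice,
# given a made-up 91% probability of choosing X. Not a claim about physics.
import random

random.seed(0)
N_WORLDS = 100_000          # hypothetical number of branch-worlds
P_CHOOSE_X = 0.91           # the 91% figure from my example

choices = ["X" if random.random() < P_CHOOSE_X else "Y" for _ in range(N_WORLDS)]
print("fraction of worlds choosing X:", choices.count("X") / N_WORLDS)  # ~0.91
print("fraction of worlds choosing Y:", choices.count("Y") / N_WORLDS)  # ~0.09
```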
Your views on free will sound suspiciously as though you’ve derived them from “The Fabric Of Reality”. Like Deutsch, you don’t seem to appreciate that this isn’t actually a response to the ‘problem of free will’ as generally understood, because it’s inherently incapable of distinguishing free decisions from randomly determined ones, and is silent on questions of moral responsibility.
No, I haven’t. I’ve derived my views entirely from this post, plus the article above.
Since you mentioned “The Fabric Of Reality,” I tried looking it up on Less Wrong, and failing that, found its Wikipedia entry. I know not to judge a book by its Wikipedia page, but I still fail to see the similarity. Please enlighten me if you don’t mind.
The following are statements about my mind-state, not about what is:
I don’t see why my view would be incapable of distinguishing free decisions from randomly determined ones. I’d go with naïve intuition: if I knowingly chose X over Y, then I’d better be prepared for X’s logical outcome Z. If I chose X expecting W instead, then I’m either wrong (and/or stupid), or X was a random choice.
As for moral responsibility, that’s even simpler. If I caused outcome Z in world A, then I’m morally responsible in proportion to my knowledge that Z would happen, plus some constant. If I pressed a button labeled W not knowing what it did, and a nearby building blew up because of it, then my responsibility = some constant. If I pressed a button labeled X knowing it would blow up a nearby building, then my responsibility > some constant. Better yet, take me to a real-world court: I shouldn’t be any more or less responsible for my actions if this view were correct than I would be in the world as it is currently understood.
Same goes for all my alternate-world zombies.
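For what it’s worth, the responsibility rule I’m gesturing at is simple enough to write down as a toy function (Python, purely illustrative; “some constant” and the knowledge scale are invented numbers, not a real proposal):

```python
# Toy model of the rule above: responsibility grows with how much I knew
# the outcome would happen, plus a baseline constant. All numbers invented.
BASELINE = 0.1  # "some constant": responsibility for pressing an unknown button

def responsibility(knowledge_of_outcome: float) -> float:
    """knowledge_of_outcome in [0, 1]: 0 = no idea, 1 = knew for certain."""
    return BASELINE + knowledge_of_outcome

# Pressed the button labeled W with no idea it blows up a building:
print(responsibility(0.0))   # == BASELINE
# Pressed the button labeled X knowing it would blow up the building:
print(responsibility(1.0))   # > BASELINE
```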