Can you give an example of the kind of usage of “Everett branch” that you consider to be inappropriately inflationary? I cannot think of an example. This either means that I have not cached the undesirable usages or that I disagree about when it is appropriate to use the phrase.
One possibility is that I actually think in terms of abstract decision-theoretic concerns, Many Worlds, and Big Worlds in general more than average. That would make me more likely to use the phrase as a literal description of my thoughts about a scenario than as, say, just an attempt to make a preference sound a little bit cooler.
I would add “Prisoner’s Dilemma” to the list. I’ve seen it used to describe basically any game-theoretic scenario with payoffs, rather than just one example of how (relative) payoffs could be set up in a symmetric two-person game.
I am loath to point to a comment a user has actually made, but anything like “I decided to go to grad school because I’m better off in the Everett branch where I have a post-grad degree than in the Everett branch where I don’t.” No, Mr. Example, you are not going to turn into (1/√2)|grad student⟩ + (1/√2)|not grad student⟩. To the extent that you are able to choose at all, your decision algorithm is deterministic. What you really mean is “I’m better off in the counterfactual where I have a post-grad degree.”
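(For concreteness: per the Born rule, those 1/√2 amplitudes would square to equal branch weights; a throwaway Python check, nothing more:)

```python
import math

amplitude = 1 / math.sqrt(2)
# Born rule: a branch's weight is the squared magnitude of its amplitude.
weights = [amplitude ** 2, amplitude ** 2]
print(weights)       # two weights of ~0.5 each
print(sum(weights))  # ~1.0 (up to floating-point rounding)
```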
Do people around here say that? I think/hope I would’ve noticed that. There were a few comments like that during the QM sequence but they were corrected, I think.
Looking up the comments I remember, I can find two or three comments that are not quite as bad as the one I made up above, but still seem to confuse Everett branches with counterfactual choices. They’re usually corrected by other users.
“To the extent that you are able to choose at all, your decision algorithm is deterministic.”

How do you know that? Because of the illusion of free will? As EY mentioned once or twice in the last rerun, your decision algorithm runs “inside physics”, which includes QM, chaos, and other non- or barely deterministic phenomena. In that case your decision to go or not to go to grad school could ultimately be triggered by a quantum measurement splitting the world into two branches. …If you subscribe to MWI, that is.
Right. But if your decision whether or not to go to school really depends on a quantum event with 50% probability, then you’re not choosing to go to school for reasons. (It would be incorrect, in that case, to say “I chose to go to grad school because I knew I’d be better off with a postgraduate degree.”) Instead, you’re choosing a mixed strategy. So to the extent that your decision is not deterministic, you don’t really choose. Similar things could be said for non-quantum chaos.
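To make the distinction concrete, here’s a toy Python sketch (the utility numbers and function names are made up for illustration): a deterministic decision algorithm maps reasons to a single choice, while a quantum-coin “decision” is just a sample from a mixed strategy, which has only an expected utility.

```python
import random

# Made-up utilities for the two outcomes.
utility = {"grad school": 10.0, "no grad school": 6.0}

def deterministic_choice(utilities):
    """A deterministic decision algorithm: the choice follows from reasons
    (here, simply: pick the option with the higher utility)."""
    return max(utilities, key=utilities.get)

def quantum_coin_choice(p=0.5, rng=random.random):
    """A 'choice' triggered by a 50/50 quantum event: not made for reasons,
    just a draw from a mixed strategy."""
    return "grad school" if rng() < p else "no grad school"

print(deterministic_choice(utility))  # grad school

# The mixed strategy has no reason behind either outcome -- only an
# expected utility averaged over the two branches:
p = 0.5
expected = p * utility["grad school"] + (1 - p) * utility["no grad school"]
print(expected)  # 8.0
```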
I believe there’s a relevant article where Eliezer defends the view that determinism is required for (his conception of) free will. EDIT: Ah yes, this one. If you and I disagree, it’s probably merely about the meaning of the word “choose”. In any case, talking about Everett branches when you’re describing the deliberations you go through in making an everyday choice like that is almost certainly mistaken.
Indeed, we can only hope that our deliberations do not trigger Everett branches, or else everything we’ve ever considered doing has actually been done by a version of ourselves in another universe. Everyone you’ve gotten angry at and thought about killing is actually dead somewhere… and then there are the anthropic effects of that…
I disagree with that in a number of subtle ways.
All game-theoretic scenarios have payoffs… what would it mean not to have payoffs?
For me, the Prisoner’s Dilemma consists in three things:
a unique Nash equilibrium, which is Pareto-inefficient
symmetry between the players
complete but imperfect information
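A minimal Python sketch (using the usual textbook payoff numbers, which are purely illustrative) checking the equilibrium property and the symmetry condition:

```python
# Standard Prisoner's Dilemma payoff matrix: each entry is
# (row player's payoff, column player's payoff).
C, D = "C", "D"
payoffs = {
    (C, C): (3, 3),
    (C, D): (0, 5),
    (D, C): (5, 0),
    (D, D): (1, 1),
}

def is_nash(r, c):
    """Neither player gains by unilaterally deviating from (r, c)."""
    row_ok = all(payoffs[(r, c)][0] >= payoffs[(alt, c)][0] for alt in (C, D))
    col_ok = all(payoffs[(r, c)][1] >= payoffs[(r, alt)][1] for alt in (C, D))
    return row_ok and col_ok

nash = [cell for cell in payoffs if is_nash(*cell)]
print(nash)  # [('D', 'D')] -- the only Nash equilibrium

# ...and it is Pareto-inefficient: (C, C) makes both players strictly better off.
dd, cc = payoffs[(D, D)], payoffs[(C, C)]
print(cc[0] > dd[0] and cc[1] > dd[1])  # True

# Symmetry between the players:
print(all(payoffs[(a, b)][0] == payoffs[(b, a)][1] for (a, b) in payoffs))  # True
```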
If you get more specific than that, you end up making a distinction between games that are all basically the same (this one has a payoff of 10 if you defect, this one only has a payoff of 2); you also make a big deal out of the fact that a Tragedy of the Commons has multiple players, even though it’s otherwise isomorphic to a Prisoner’s Dilemma.
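The multi-player point can be sketched the same way. In a toy Tragedy of the Commons (my own illustrative payoff function), overgrazing is a dominant strategy for each player, yet everyone-restrains Pareto-dominates everyone-overgrazes, which is structurally the same trap as the two-player game:

```python
# Toy Tragedy of the Commons with n players: each player either
# restrains (0) or overgrazes (1).  Overgrazing gains the individual +2
# but degrades the commons, costing every player 1 per overgrazer.
def payoff(me, others):
    """Payoff to a player choosing `me`, given the other players' choices."""
    total_grazing = me + sum(others)
    return 2 * me - total_grazing

n = 5
others_all_restrain = [0] * (n - 1)
others_all_graze = [1] * (n - 1)

# Overgrazing is dominant: better regardless of what the others do.
assert payoff(1, others_all_restrain) > payoff(0, others_all_restrain)
assert payoff(1, others_all_graze) > payoff(0, others_all_graze)

# Yet all-restrain Pareto-dominates all-graze:
print(payoff(0, others_all_restrain))  # 0 each if everyone restrains
print(payoff(1, others_all_graze))     # -3 each if everyone overgrazes
```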
So here you and I might disagree; maybe I would abstract the concept further than you would. I presume you’re not limiting “Prisoner’s Dilemma” to actual prisoners, because that seems tremendously silly. So how far would you limit it?
But are there really people who go around applying the term “Prisoner’s Dilemma” to things like Stag Hunts or zero-sum games?
“So how far would you limit it?”

Approximately the same. I wouldn’t use “Prisoner’s Dilemma” to describe a Tragedy of the Commons myself, but would be unlikely to correct it. In some such cases I’d prefer to just use “Newcomblike”, which takes the abstraction a step further (removing the strict necessity for symmetry) but is also overtly an abstraction.
“But are there really people who go around applying the term ‘Prisoner’s Dilemma’ to things like Stag Hunts or zero-sum games?”

Yes.
If people are applying “Prisoner’s Dilemma” to zero-sum games, I can see why you’d be annoyed. It clearly shows that they don’t know anything about game theory.
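The mistake is easy to test for: in a zero-sum (more generally, constant-sum) game every cell’s payoffs sum to the same constant, which fails immediately for any PD-style matrix. A quick Python check with illustrative numbers:

```python
# A zero-sum game: every cell's payoffs sum to zero (matching pennies).
matching_pennies = {("H", "H"): (1, -1), ("H", "T"): (-1, 1),
                    ("T", "H"): (-1, 1), ("T", "T"): (1, -1)}

# Textbook Prisoner's Dilemma numbers: clearly not constant-sum.
prisoners_dilemma = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
                     ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def is_constant_sum(game):
    """True iff the two players' payoffs sum to the same constant in every cell."""
    sums = {a + b for a, b in game.values()}
    return len(sums) == 1

print(is_constant_sum(matching_pennies))   # True
print(is_constant_sum(prisoners_dilemma))  # False (cell sums of 6, 5, and 2 occur)
```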