I believe that the AI does care about your beliefs, just not specific beliefs. The AI only cares about whether your decision theory falls into the class of decision theories that pick two boxes, and if it does, then it punishes you. Sure, unlike Dick Kick’em, the AI isn’t looking for specific theories, just any theory within a specific class, but it is still the same thing: the AI punishes the agent by putting in less money based solely on your beliefs. In Newcomb’s paradox, the AI scans your brain BEFORE you take any action whatsoever, so the punishment cannot be based on your actions; it is based only on your beliefs. This is exactly the same as the Dick Kick’em Paradox: Dick will punish you purely for your beliefs, not for any action. The only difference is that in Newcomb’s paradox you get to play a little game after the AI has punished you.
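The structure of that claim can be sketched in a few lines. This is a minimal illustration of the Newcomb setup as described above, assuming the standard payoffs ($1,000,000 in the opaque box, $1,000 in the transparent one) and a perfectly reliable predictor; the function names are my own, not from the original:

```python
def omega_fills_boxes(predicted_choice):
    """Omega's 'punishment' happens here, based only on the scanned
    prediction, before the agent takes any action at all."""
    opaque = 1_000_000 if predicted_choice == "one-box" else 0
    transparent = 1_000
    return opaque, transparent

def payoff(actual_choice, opaque, transparent):
    """The agent's later action only selects among already-fixed contents."""
    if actual_choice == "two-box":
        return opaque + transparent
    return opaque

# A perfectly reliable predictor means predicted_choice == actual_choice:
for choice in ("one-box", "two-box"):
    opaque, transparent = omega_fills_boxes(choice)
    print(choice, payoff(choice, opaque, transparent))
```

The point of the sketch is that `omega_fills_boxes` runs first and takes only the prediction (i.e., the scanned belief) as input, which is the sense in which the "punishment" precedes any action.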
Eh, Omega only cares about your beliefs insofar as they affect your actions (past, present, or future, it’s all just a different t coordinate). I still think that seems way more natural and common than caring about beliefs in general.
Example: Agent A goes around making death threats, saying to people: “Give me $200 or I’m going to kill you.” Agent B goes around handing out brochures that criticize the government. If the police arrest agent A, that’s probably a reasonable decision. If the police arrest agent B, that’s bad and authoritarian, since it goes against freedom of expression. This is true even though all either agent has done is say things. Agent A hasn’t actually killed anyone yet. But the police still arrest agent A because they care about agent A’s future actions.
“How dare you infer my future actions from what I merely say,” cries agent A as they’re being handcuffed, “you’re arbitrarily punishing me for what I believe. This is a crass violation of my right to freedom of speech.” The door of the police car slams shut and further commentary is inaudible.
Omega only cares about your beliefs insofar as they affect your actions
So does Dick Kick’em, since he only cares about which decision theory a particular agent believes in, and that in turn determines the agent’s actions.