I read quite recently, though I can't remember where (LessWrong itself? ETA: yes, here and on So8res' blog), someone saying that Newcomb-like problems are the rule in social interactions. Every time you deal with someone who is trying to predict what you will do, and who might be better at it than you, you have a Newcomb-like problem. If you just make what seems to you like the obviously better decision, the other person may have anticipated that and made that choice appear deceptively attractive.
“Hey, check out this great offer I received! Of course, these things are scams, but I just can’t see how this one could be bad!”
“Dude, you’re wondering whether you should do exactly what a con artist has asked you to do?”
Now and then some less technically-minded friend will ask my opinion about a piece of dodgy email they received. My answer always begins, “IT’S A SCAM. IT’S ALWAYS A SCAM.”
Newcomb’s Problem reduces the situation to its bare essentials. A decision theory that two-boxes may not be much use for an AGI, or for a person.
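To make the "bare essentials" concrete, here is a minimal sketch of the expected-value arithmetic, using the conventional $1,000/$1,000,000 payoffs (those figures are the standard statement of the problem, not anything specific to this post):

```python
# Expected payoffs in the standard Newcomb setup: a transparent box
# always holds $1,000; an opaque box holds $1,000,000 iff the predictor
# foresaw that you would take only the opaque box.

def expected_value(one_box: bool, accuracy: float) -> float:
    """Expected payoff given the predictor's accuracy (0.0 to 1.0)."""
    small, big = 1_000, 1_000_000
    if one_box:
        # The big box is full only when the predictor correctly
        # foresaw one-boxing, which happens with probability `accuracy`.
        return accuracy * big
    # Two-boxing: you always get the small box, and the big box is full
    # only when the predictor wrongly expected you to one-box.
    return small + (1 - accuracy) * big

for acc in (0.5, 0.5005, 0.6, 0.9, 0.99):
    print(f"accuracy {acc:.4f}: "
          f"one-box {expected_value(True, acc):>11,.0f}  "
          f"two-box {expected_value(False, acc):>11,.0f}")
```

The break-even point is an accuracy of 0.5005: any predictor meaningfully better than a coin flip makes one-boxing the higher expected-value choice, which is why "the other person may be better at predicting you than you are" matters so much in the con-artist cases above.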