Being completely simulated by an external party is an unrealistic scenario for a human, but a very realistic one for an artificial intelligence. I always assumed that was one of the primary reasons for LW’s fascination with Omega problems.
Also, not all Omega problems are equal. As has been pointed out a bazillion times, Newcomb’s Paradox works just as well if you only assume a good guesser with a consistently better-than-even track record (and indeed, IMO, it should be phrased like this from the start, sacrificing simplicity for the sake of conceptual hygiene), so insanity considerations are irrelevant. On the flip side, Counterfactual Mugging is “solved” by grabbing the best available payoff right now, as you answer the question, so the likelihood that Omega actually exists does have a certain role to play.
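A quick sanity check of the better-than-even claim, assuming the standard payoffs of $1,000,000 in the opaque box and $1,000 in the transparent one (these figures are the usual textbook version, not anything specific to this thread):

```python
# Expected-value comparison for Newcomb's problem with an imperfect predictor
# who is correct with probability p. Standard payoffs assumed.
BIG, SMALL = 1_000_000, 1_000

def ev_one_box(p):
    # The opaque box is full exactly when the predictor correctly foresaw one-boxing.
    return p * BIG

def ev_two_box(p):
    # The opaque box is full only when the predictor wrongly expected one-boxing.
    return (1 - p) * BIG + SMALL

# One-boxing wins as soon as p * BIG > (1 - p) * BIG + SMALL,
# i.e. p > (BIG + SMALL) / (2 * BIG).
threshold = (BIG + SMALL) / (2 * BIG)
print(threshold)                              # 0.5005
print(ev_one_box(0.51) > ev_two_box(0.51))    # True
```

So on a straight expected-value reading, any track record above 50.05% already favors one-boxing; no infallible Omega is needed.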
> Being completely simulated by an external party is an unrealistic scenario for a human, but a very realistic one for an artificial intelligence.
Being completely simulated by an external party is a realistic scenario for a human, given that an artificial intelligence exists. This might also be part of the fascination.
> Being completely simulated by an external party is a realistic scenario for a human, given that an artificial intelligence exists. This might also be part of the fascination.
And also conditional on the frequent LW belief that AIs become gods, but yeah.