I find myself wanting to reach for an asymptotic function and map most of these infinities back to finite values. I can’t quite swallow assigning a non-finite value to infinite lizard. At some point, I’m not paying any more for more lizard no matter how infinite it gets (which probably means I’d need some super-asymptote that continues working even as the infinities get progressively more insane).
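To make that concrete (just an illustrative sketch, not a worked-out proposal): something like u(x) = 1 − e^(−x/k) caps the value of x units of lizard below 1 no matter how large x grows, so even “infinitely much lizard” only ever buys you the asymptote. The part I don’t know how to do is extend that once x itself is allowed to be one of several different sizes of infinity.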
I’m largely on board with the idea that more good things happening to more people is always better, but I think I’d give up the notion of computing utilons by simple addition before accepting any of the above.
I also reject Pascal’s wager, which is a (comparatively simple) instance of these infinite problems, for reasons that seem like they should generalize but are hard to articulate. My first stab would be something along the lines of: my prior for any given version of heaven existing shrinks at least as fast as the promised values increase. I think this follows from finite examples; e.g., if someone offers you a wager with a billion-dollar payout, the chances they’re good for it are much lower than for a million-dollar payout. Large swaths of the insane results here stem from accepting bizarre wagers at face value; while that’s a useful simplifying assumption for much of philosophy, I think it’s one this topic has outgrown. The absurdity heuristic is a keeper.
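As a toy version of that shrinking-prior idea (my own back-of-the-envelope, not anything from the original argument): if my credence that someone can actually deliver a payout of size V falls off at least as fast as c/V for some constant c, then the expected value of their offer is at most (c/V)·V = c, and naming ever-bigger numbers, or “infinity,” never pumps the expected value past that bound. The hard part is justifying that the prior really does fall off that fast rather than just asserting it because I dislike the conclusion.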