This focuses a bit heavily on Pascal’s Mugging and not on existential risks, but since you may have given me an entirely new idea about it, and since it also goes into the idea of really good arguments, it seems reasonable to put it here.
Previously, I had been thinking of Pascal’s Mugging in terms of a spam filter: Pascal’s Mugging resembles spam, so it should be discarded. After reading your post, however, I thought of an entirely different way to approach it, and I wanted to post it here for thoughts.
Let’s say someone who looks relatively harmless walks up out of an alley and says that he will cause a lifetime of torture to ONE person if you don’t give him some small amount of money.
Many people would think “He’s referring to me! Eek, I’m being mugged, and not just mugged but mugged by a crazy guy!” Rational people might run some quick calculations in their heads and decide that giving him the money is usually the rational thing to do, or perhaps that the rational thing to do is to walk away.
Of course, there’s nothing particularly Pascalian about that mugging. That’s basically just a mugging. Let’s call it Mugging 0.
So now consider Mugging 1.
Let’s say someone who looks relatively harmless walks up out of an alley and says that he will cause a lifetime of torture to THREE people if you don’t give him some small amount of money.
In general, it seems safe to say that you are making slightly different calculations than in Mugging 0. Maybe you’re more likely to give him the money; maybe you’re less likely to give him the money. Either way, your utility calculation changes.
Now, you can take this out to arbitrarily high powers of three (http://www.quadibloc.com/crypto/t3.htm) and get Mugging 2 (9 people), Mugging 3 (27 people), Mugging 4 (81 people), and so on; in general, Mugging n threatens 3^n people.
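Here’s a quick sketch in Python, purely for the arithmetic, of how fast the threatened numbers grow under this scheme:

```python
# Just the arithmetic: Mugging n threatens 3**n people.
for n in range(0, 22):
    print(f"Mugging {n}: {3**n:,} people threatened")
# e.g. Mugging 5 threatens 243 people; Mugging 21 threatens 10,460,353,203.
```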
That leads into what I’m thinking of as a possible new approach.
For most rationalists who are bothered by Pascal’s Mugging, there are some ranges of numbers for which you will give money and some ranges for which you won’t. (If you would give money regardless of the number, or refuse regardless of the number, you probably aren’t the type of person who is bothered by Pascal’s Mugging in the first place.)
As an example on one hand, let’s say that you personally can’t stand the thought of being responsible for more than 100 lifetimes of torture; you don’t want there to be even a 1 in 1 quadrillion chance of that happening. Based on that utility calculation, you might rationally switch your behavior at Mugging 5 (where he threatens 243 people) from “Don’t give money” to “Give money.”
As an example on the other hand, at Mugging 21 there might be a rather large boost in skepticism. That’s 10,460,353,203 people, which is more than are currently on Earth. How is he going to torture that many people? So let’s say 3^21 is a point where it’s rational to flip your behavior from “Give money” back to “Don’t give money.”
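To make those two examples concrete, here’s a toy sketch in Python of a decision rule that produces both flips. The credence function, the population figure, and both thresholds are made-up illustrative numbers, not anything I’m claiming is the right model:

```python
# Toy model of the two switch points described above. All specific numbers
# (credence function, thresholds, population figure) are made up for
# illustration; the point is only that a rule like this can flip twice.

WORLD_POPULATION = 8_000_000_000   # rough current figure
PAIN_THRESHOLD = 100               # "can't stand being responsible for >100 lifetimes"
CREDENCE_FLOOR = 1e-15             # "not even a 1 in 1 quadrillion chance"

def credence(n: int) -> float:
    """Made-up credence that the mugger can really torture 3**n people:
    it decays gently with n, then collapses once the threat exceeds the
    number of people available to be tortured (Mugging 21 and beyond)."""
    base = 0.001 * (0.9 ** n)
    if 3 ** n > WORLD_POPULATION:
        base *= 1e-20  # the "rather large boost in skepticism"
    return base

def pay(n: int) -> bool:
    """Give money iff the threat exceeds the pain threshold AND the
    credence is still above the 1-in-a-quadrillion floor."""
    return 3 ** n > PAIN_THRESHOLD and credence(n) > CREDENCE_FLOOR

previous = pay(0)
for n in range(1, 31):
    current = pay(n)
    if current != previous:
        decision = "Give money" if current else "Don't give money"
        print(f"Mugging {n} ({3**n:,} people): flip to '{decision}'")
    previous = current
# Under these made-up numbers, this prints a flip at Mugging 5 (to 'Give money')
# and a flip back at Mugging 21 (to 'Don't give money').
```

Under this made-up model there is a later flip back to refusing; my question below is whether a rational credence function leaves a last flip to giving money after which you never switch back.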
But what I’m getting at is this: if giving in to Pascal’s Mugging is rational, then there must be a point or region where it is rational to flip from “Don’t give money” to “Give money,” and there must be no later point or region where you flip back from “Give money” to “Don’t give money.” So my question is: at approximately what order of magnitude is this final point or region, the one that represents the smallest rational Pascal’s Mugging? It doesn’t have to be perfectly accurate, and it will probably vary from person to person anyway, which is why I’m expecting something along the lines of a range of orders of magnitude rather than any individual number. An answer might look like: “Well, possibly somewhere between Mugging 40 and Mugging 50. It would be rational to switch to giving money around there and never switch back after that; even if someone threatened a Mugging 100, with 3^100 lifetimes of torture, there aren’t any new physically expressible rational reasons that would apply.”