Some people look at the magnitude of the probability and ignore the disutility (“Sure, it would be awful, but it’s not going to happen, so who cares?”).
It seems rather difficult to actually affect those people, though. The difference between P1=.04 and P1=.08 would have dramatic effects on an EV-calculator, but very little effect on the sort of person who judges probabilities by ‘feel’.
That is, if I start out with a P1 confidence that I will be arrested and convicted for committing a crime, a P2 confidence that if convicted I will receive significant prison time, and a >.99 confidence that the disutility of significant prison time is D1, and you want to double my expected disutility of committing that crime, you can double P1, or P2, or D1, or mix-and-match.
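The arithmetic behind that claim can be sketched in a few lines. The specific numbers below are illustrative placeholders, not values from the discussion:

```python
# Expected disutility of committing the crime, modeled as the product of
# P1 (probability of arrest and conviction), P2 (probability of significant
# prison time given conviction), and D1 (disutility of that prison time).
def expected_disutility(p1, p2, d1):
    return p1 * p2 * d1

base = expected_disutility(0.04, 0.5, 100)

# Doubling any single factor doubles the product, so each lever is
# interchangeable from a pure expected-value standpoint:
assert expected_disutility(0.08, 0.5, 100) == 2 * base  # double P1
assert expected_disutility(0.04, 1.0, 100) == 2 * base  # double P2
assert expected_disutility(0.04, 0.5, 200) == 2 * base  # double D1
```

The point of contention in the thread is not this arithmetic but which lever is cheapest to pull, and whether the 'feel'-based decision-makers respond to it at all.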
I would suppose the D1 advocates would argue that the hidden costs of increasing P1 are higher than you think, or possibly they just value them more (e.g. the right to privacy). I admit I’ve never heard a good argument that what the US needs is to greatly increase the likelihood of sentencing a convict to significant prison time.
The difference between P1=.04 and P1=.08 would have dramatic effects on an EV-calculator, but very little effect on the sort of person who judges probabilities by ‘feel’.
I would expect it depends a lot on the algorithms underlying “feel” and what aspects of the environment they depend on. It’s unlikely these people are choosing their behaviors or beliefs at random, after all.
More generally, if I actually want to manipulate the behavior of a group, I should expect that a good first step is to understand how their behavior depends on aspects of their environment, since often their environment is what I can actually manipulate.
Edit: I should add to this that I certainly agree that it’s possible in principle for a system to be in a state where the most cost-effective thing to do to achieve deterrence is increase D. I just don’t think it’s necessarily true, and am skeptical that the U.S. is currently in such a state.
the hidden costs of increasing P1 are higher than you think
Sure, that’s another possibility. Or of P2, come to that.
I admit I’ve never heard a good argument that what the US needs is to greatly increase the likelihood of sentencing a convict to significant prison time.
Is this not the rationale behind mandatory sentencing laws?
I admit I’ve never heard a good argument that what the US needs is to greatly increase the likelihood of sentencing a convict to significant prison time.
Is this not the rationale behind mandatory sentencing laws?
I can’t think of a response to this that isn’t threatening to devolve into a political argument, so I’ll bow out here. Sorry.