If I understand it correctly, this is the paradox:
How would you define optimal insurance? You cannot have 100% certainty, so let’s say that optimal insurance means “this thing cannot fail, unless literally the whole society falls apart”.
Sounds good, doesn’t it? Until you realize that this definition is equivalent to “if this fails, then literally the whole society falls apart”. Which sounds scary.
The question is how acceptable it is to put all your eggs in one basket, if doing so increases the expected survival of every individual egg. In addition to the straightforward “shut up and multiply” calculation (sketched below), please consider all the moral hazard this would bring. People are not good at imagining small probabilities, so if before they were okay with e.g. a 1% probability of losing one important thing, now they will become okay with a 1% probability of losing everything.
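To make the trade-off concrete, here is a minimal back-of-the-envelope sketch. The specific failure probabilities are assumptions chosen for illustration, not taken from anywhere; the point is only the shape of the comparison:

```python
# Illustrative numbers only (assumed, not from the original comment):
# Scenario A: 10 independent baskets, each failing with probability 5%.
# Scenario B: one pooled basket failing with probability 1%.

p_each, n = 0.05, 10        # diversified: per-basket failure probability
p_pooled = 0.01             # pooled: single-basket failure probability

# Expected fraction of eggs lost equals the per-egg failure rate.
expected_loss_diversified = p_each    # 5% of eggs lost on average
expected_loss_pooled = p_pooled       # 1% of eggs lost on average

# Probability of losing *everything* at once (assuming independence):
p_total_loss_diversified = p_each ** n   # ~1e-13, astronomically small
p_total_loss_pooled = p_pooled           # a full 1%

print(f"Diversified: expected loss {expected_loss_diversified:.0%}, "
      f"total-loss probability {p_total_loss_diversified:.2e}")
print(f"Pooled:      expected loss {expected_loss_pooled:.0%}, "
      f"total-loss probability {p_total_loss_pooled:.2e}")
```

Under these assumed numbers, pooling makes every individual egg safer (1% vs. 5% expected loss), yet the probability of losing everything jumps from roughly 10⁻¹³ to 1%. That is exactly the scary equivalence above: the safer each egg becomes, the more the failure mode concentrates into “everything at once”.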