But x-risk isn’t a concentrated cost; it’s a distributed infinite cost.
This depends on how ‘altruistic’ your values are. For some people, the total value to them of all other humans (ever) is less than the value to them of their own life, and so something that risks blowing up the Earth reads to their decision-making process much like something that risks blowing up only themselves. And sometimes, one or both of those values are negative. [As a smaller example, consider the pilots who commit suicide by crashing their plane into the ground—at least once with 150 passengers in the back!]
That said, I made a simple transposition error; it’s supposed to be “concentrated benefits and distributed costs.”