Right, so how shall we assess whether these risks are worth addressing?
You assume a good way of assessing existential risk even exists. How difficult is it to accept that it doesn't? It is irrational to deny the existence of unknown unknowns.
It's quite likely that a few more existential risks will get decent estimates the way asteroid impacts did, but there's no reason to expect that to be typical, and which ones do will most likely be a matter of serendipity.
No, I don't assume that there's a good way. I'm assuming only that we will either act or not act, so we will find that we have decided between action and inaction one way or another, whether we like it or not. So I'm asking for the third time: how shall we make that decision?
Using some embarrassingly bad reasoning, self-serving lies, and inertia, the way we make all decisions as a society. We will devote an unreasonable amount of resources to risks that aren't serious, and stay entirely unaware of the most dangerous ones. No matter which decision procedure we adopt, this will be the result.
It is clear from your repeated evasions that you have no proposal that improves on the decision procedure I've put forward.
What evasions? I thought I'd clearly stated that I view your decision procedure as pretty much "make up a bunch of random numbers, multiply, and compare".
An improvement would be to skip this rationality theater and admit we don't have a clue.
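For concreteness, a minimal Python sketch of the "multiply and compare" procedure being caricatured above; every probability, damage figure, and cost below is an invented placeholder, not an estimate anyone in this thread has endorsed.

```python
# A minimal sketch of the "make up numbers, multiply, and compare" procedure
# under discussion. All figures are hypothetical placeholders.

def expected_loss(p_per_year: float, damage: float, horizon_years: int) -> float:
    """Expected loss over the horizon, assuming a constant annual probability."""
    p_any = 1 - (1 - p_per_year) ** horizon_years  # probability of at least one event
    return p_any * damage

# Hypothetical inputs: a guessed annual probability, a guessed damage figure,
# and the cost of a proposed mitigation effort.
risk = expected_loss(p_per_year=1e-6, damage=1e15, horizon_years=100)
mitigation_cost = 1e7

# "Compare": act if and only if the expected loss exceeds the cost of acting.
print("act" if risk > mitigation_cost else "do not act")
```

The dispute in this exchange is precisely over whether an input like p_per_year can ever be anything better than a made-up number.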
AND THEN DECIDE HOW?
By tossing a coin or using a Ouija board? None of the alternatives proposed has a better track record.