[Question] Implications of the Doomsday Argument for x-risk reduction
LessWrong has a large overlap between people interested in x-risk reduction and people aware of the Doomsday Argument. Yet the two seem incompatible with each other, so I'm going to ask about the elephant in the room:
What is your stance on the Doomsday Argument? Does it make you more or less motivated to work on x-risks? Is it a significant concern for you at all?
Do most people working on x-risks believe the Doomsday Argument to be flawed?
If not, it seems to me that avoiding astronomical waste is itself astronomically unlikely: the Doomsday Argument makes a long, populous future correspondingly improbable, which cancels the astronomical payoff and leaves x-risk reduction as a moderately important issue for humanity at best. From an individual perspective (or an altruistic perspective with future discounting), should we perhaps just focus on having a good time before the seemingly inevitable doom? What am I missing?
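To spell out the cancellation I have in mind: under the self-sampling assumption, learning my birth rank $n$ penalizes hypotheses with more total observers. A rough sketch, with $N_S$ and $N_L$ as purely illustrative totals for a "short" and a "long" future:

$$
\frac{P(\text{long} \mid n)}{P(\text{short} \mid n)}
= \frac{P(\text{long})}{P(\text{short})} \cdot \frac{1/N_L}{1/N_S}
= \frac{P(\text{long})}{P(\text{short})} \cdot \frac{N_S}{N_L}.
$$

So the posterior probability of the long future is suppressed by roughly $N_S/N_L$, while the value at stake in that future scales with $N_L$. On this naive calculation the two astronomical factors cancel, leaving an expected value on the order of $N_S$ lives, which is the "balancing out" I mean above.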