Serious, non-rhetorical question: what’s the basis of your preference? Anything more than just affinity for your species?
I’m not 100% sure what you mean by parasite removal… I guess you’re referring to bad decision-makers, or bad decision-making processes? If so, I think existential risks are interlinked with parasite removal: the parasites cause, or at least hasten, the risks. Therefore, to truly address existential risks, you need to address parasite removal.
If I live forever, whether through cryonics or a positive intelligence explosion before my death, I’d like to have a lot of people to hang around with. Additionally, the people you’d be helping through EA aren’t the people who are fucking up the world at the moment. Plus, there isn’t really anything directly important to me outside of humanity.
Parasite removal refers to removing literal parasites from people in the third world, as an example of one of the effective charitable causes you could donate to.
EA? (Sorry to ask, but it’s not in the Less Wrong jargon glossary and I haven’t been here in a while.)
Parasite removal refers to removing literal parasites from people in the third world
Oh. Yes. I think that’s important too, and it actually pulls on my heartstrings much more than existential risks that are potentially far in the future, but I would like to avoid hyperbolic discounting and focus on the most important issue facing humanity, sans cognitive bias. But since human motivation isn’t flawless, I may end up focusing on something more immediate. Not sure yet.
EA is Effective Altruism.
Ah, thanks. :)