In fact, I wish I had never come across the initial link on the internet that caused me to think about transhumanism, and thereby about the singularity.
As I understand it, you donate (and plan to in the future) to existential risk charities, and that is one of the consequences of your having come across that link. How does this come out as a net negative, in your estimation, or are you answering a different question?
Sure, I want to donate. But if you frame it as a hypothetical choice between being a person who didn’t know about any of this and had no way of finding out, versus what I have now, I choose the former. Though since that is not an available choice, it is a somewhat academic question.
I can’t believe I’m hearing this from a person who wrote about Ugh fields. I can’t believe I’m reading a plea for ignorance on a blog devoted to refining rationality. Is “ignorance is bliss” the new motto now?
Well, look, one has to do cost/benefit calculations, not just blindly surge forward in some kind of post-Enlightenment fervor. To me, it seems like there is only one positive term in the equation: the altruistic value of giving money to some existential risk charity.
All the other terms are negative, at least for me. And unless I actually overcome the excuses, the akrasia, etc., and donate a lot, I think it will all have been a mutually detrimental waste of time.
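As a minimal sketch of the cost/benefit framing above, assuming illustrative cost terms (distress, time, exposure) that the exchange itself does not name:

```latex
% Hedged sketch of the informal cost/benefit equation described above.
% Only V_donation is claimed to be positive; the cost labels below
% (distress, time, exposure) are illustrative assumptions, not terms
% taken from the original exchange.
\[
\text{net value of knowing} \;=\;
V_{\text{donation}}
\;-\; C_{\text{distress}}
\;-\; C_{\text{time}}
\;-\; C_{\text{exposure}}
\]
% The claim is that unless V_donation is large, the sum is negative.
```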
There is only one final criterion, the human decision problem. It trumps any other rule, however good or useful.
(You are appealing to particular heuristics, using the feeling of indignation as a rhetorical weapon.)
Not helping. I was referring to the moral value of donations as an argument for choosing to know, as opposed to not knowing. You don’t seem to address that in your reply (did I miss something?).
Oh, I see. Well, I guess it depends upon how much I eventually donate and how much of an incremental difference that makes.
It would certainly be better to just donate, AND to also not know anything about anything dangerous. I’m not even sure that’s possible, though. For all we know, just knowing about any of this is enough to land you in a lot of trouble either in the causal future or elsewhere.