The various silly people who think I want to keep the flesh around forever, or constrain all adults to the formal outline of an FAI, are only, of course, making things up; their imagination is not wide enough to understand the concept of some possible AIs being people, and some possible AIs being something else.
Presuming that I am one of these “silly people”: Quite the opposite, and it is hard for me to imagine how you could fail to understand that from reading my comments. It is because I can imagine these things, and see that they have important implications for your ideas, and see that you have failed to address them, that I infer that you are not thinking about them.
And this post reveals more failings along those lines: imagining that death is something too awful for a God to allow is incompatible with viewing intelligent life in the universe as an extended system of computations, and again suggests you are overly attached to linking agency and identity to discrete physical bodies. The way you present this, as well as the discussion in the comments, suggests you think “death” is a thing that can be avoided by living indefinitely; this, too, is evidence of not thinking deeply about identity in deep time. The way you speak about the danger facing you (not the danger facing life, which I agree with you about, but the personal danger of death) suggests that you want to personally live on beyond the Singularity; whereas the more coherent interpretations of your ideas that I’ve heard from Mike Vassar imply annihilation, or equivalent transformation, of all of us by the day after it. It seems most likely to me either that you’re intentionally concealing that the good outcomes of your program still involve the “deaths” of all humans, or that you just haven’t thought about it very hard.
What I’ve read of your ideas for the future suffers greatly from your not having worked out (at least on paper) notions of identity and agency. You say you want to save people, but you haven’t said what that means. I think that you’re trying to apply verbs to a scenario that we don’t have the nouns for yet.
It is extraordinarily difficult to figure out how to use volunteers. Almost any nonprofit trying to accomplish a skilled-labor task has many more people who want to volunteer their time than they can use. The Foresight Institute has the same problem: People want to donate time instead of money, but it’s really, really hard to use volunteers. If you know a solution to this, by all means share.
The SIAI is Eliezer’s thing. Eliezer is constitutionally disinclined to value the work of other people. If the volunteers really want to help, they should take what I read as Eliezer’s own advice in this post, and start their own organization.