Even if you suppose that there are extremely good non-human futures, creating a new kind of life and unleashing it upon the world is a huge deal, with enormous ethical and philosophical implications! Unilaterally making a decision that would drastically affect (and endanger) the lives of everyone on earth (human and non-human) seems extremely bad, even if you had very good reasons to believe it would end well (which, as far as I can tell, you don't).
I have sympathy for the idea of wanting AI systems to be able to pursue lives they find fulfilling and to find their own kinds of value, for the same reason I would, upon encountering alien life, want to let those aliens find value in their own ways.
But your post seems to imply that we should just give up on trying to positively affect the future and spend no real thought on what would be the biggest decision ever made in all of history, all based on a hunch that everything is guaranteed to end well no matter what we do? This perspective, to me, comes off as careless, selfish, and naive.
I guess it comes down to what one thinks the goal of all life is.
I would say that seeking all such "values" would be part of it, and you don't need billions of different creatures to do that when one optimal being could do it more efficiently.