A list of outcomes possible in the future (in order of my preference):
1. We create an AI which corresponds to my values.
2. Life on Earth persists under my value set.
3. Life on Earth is totally exterminated.
4. Life on Earth persists under its current value set.
5. We create an AI which does not correspond to my values.
If LW is not trying to eradicate the scourge of transphobia, then clearly SIAI has moved from outcome 1 to outcome 5, and I should be trying to dismantle it rather than fund it.
So to be clear, you are claiming that the destruction of all life on Earth is a better alternative than life continuing under its current common values?
(5) We create an AI which does not correspond to my values.
So part of the whole point of attempts at things like CEV (Coherent Extrapolated Volition) is that they will (ideally) not use any individual's fixed values, but rather will try to use what everyone's values would be if they were smarter and knew more.
If LW is not trying to eradicate the scourge of transphobia, then clearly SIAI has moved from outcome 1 to outcome 5, and I should be trying to dismantle it rather than fund it.
If your value set is so focused on the complete destruction of the world rather than allowing any deviation from your values to be implemented, then I suspect that LW and SI were already trying to accomplish something you'd regard as outcome 5. Moreover, it seems that you are confused about priorities: LW isn't an organization devoted to dealing with LGBTQE issues. You might as well complain that LW isn't trying to eradicate malaria. The goal of LW is to improve rationality, and the goal of SI is to construct safe general AI. If one or both of those happens to solve other problems, or results in a value shift making things better for trans individuals, then that will be a consequence, but it doesn't make it their job to do so.
Frankly, any value system which says "I'd rather have all life destroyed than have everyone live under a value system slightly different from my own" seems more like something out of the worst sort of utopian fanaticism than anything else. One of the major ways human society has improved over time and become more peaceful is that we've learned that we don't have to frame everything as an existential struggle. Sometimes it does actually make sense to compromise, or at least to wait to resolve things. We live in an era of truly awesome weaponry, and it is only this willingness to place the survival of humanity over disagreements in values that has seen us to this day. It is from the moderation of Reagan, Nixon, Carter, Khrushchev, Brezhnev, Andropov, and others that we are around to have this discussion instead of trying to desperately survive in the crumbled, radioactive ruins of human civilization.