It is a valid worry. But under the right conditions, where we take care not to let evolutionary dynamics take hold, we might get a better shot at a friendly singularity than by any other route.
Possibly. But I’d rather use selected human geniuses with the right ideas, copied and sped up, and wait for them to crack FAI before going further (even if FAI doesn’t give a powerful intelligence explosion, FAI would then simply be the formalization and preservation of preference, rather than the power to enact that preference).