I’m with Calvin. I think misaligned AGI is more likely to do local bad things when first invented. There are more partially rational AGIs than fully rational ones, so I expect those to be found first.
I do think it matters who develops safe AGI first, and whether there is a chance for a last-minute power grab during the development process.