I agree that one could do something similar with tech other than neat biotech, but I don’t think this shows that Kudzugoth Alignment is as difficult as general alignment. Aligning AI to achieve something specific is likely to be a lot easier than aligning AI in general. It’s questionable whether the latter is even possible, and unclear what it would even mean to achieve it.