I agree there is a sense in which AI alignment research today is alchemy, but I think we are making progress toward turning it into chemistry. That said, the point doesn't seem very relevant to the rest of your position, which is more about how humans stay relevant in a world with beings more powerful than unaugmented humans.
Even if we did turn it into chemistry, it would be a wasteful and delusional AI system. It would be akin to all of humanity devoting every living and waking moment to building ant colonies and feeding the ants. How irrational does that sound? Won't an AI eventually realize that?