If AGI is developed today, it would be net beneficial for humanity’s long-term future
This seems like a request to condition on an event of infinitesimal probability, and I have no idea how to interpret the question. I assume you're not going for "if there is some secret government project to make an AGI, how good do you think they are at aligning it?"