I agree, and I think this is critical. The standard of getting >90% of the possible value from our lightcone, or anything similar, seems ridiculously high given the very real possibility of achieving zero or negative value.
And it seems certain that there's no absolute standard for achieving human values; what those values are is path-dependent.
But we can still achieve an unimaginably good future by building an ASI that does roughly what humans want.