I think it’s likely that another Cultural Revolution could happen, and that it could adversely affect the future if it coincides with a transition to an AI-based economy.
This seems to be ignoring the part of my comment at the top of this sub-thread, where I said “[...] has also made me more pessimistic about non-AGI or delayed-AGI approaches to a positive long term future (e.g., the Long Reflection).” In other words, I’m envisioning a long period of time in which humanity has the technical ability to create an AGI but is deliberately holding off to better figure out our values or otherwise perfect safety/alignment. I’m worried about something like the Cultural Revolution happening in this period, and you don’t seem to be engaging with that concern?
Ahh. To be honest, I read that, but then responded to something different. I assumed you were just expressing general pessimism, since there’s no guarantee that we would converge on good values during a Long Reflection (and you recently had a visceral realization of how arbitrary values are).
Now I see that your worry is narrower: that something like the Cultural Revolution might happen during this period, and that humanity might act unwisely and create the AGI in its wake. I guess this seems quite plausible, and is an important concern, though I personally am skeptical that anything like the Long Reflection will ever happen.
“Ahh. To be honest, I read that, but then responded to something different. I assumed you were just expressing general pessimism, since there’s no guarantee that we would converge on good values during a Long Reflection (and you recently had a visceral realization of how arbitrary values are).”
I guess I was also expressing a more general update toward pessimism: even if nothing happens during the Long Reflection that causes humanity to prematurely build an AGI, other new technologies available or deployed during that period could invalidate the historical tendency for “Cultural Revolutions” to dissipate over time and for moral evolution to continue along longer-term trends.
“[...] though I personally am skeptical that anything like the Long Reflection will ever happen.”
Sure, I’m skeptical of that too, but given my pessimism about more direct routes to building an aligned AGI, I thought it might be worth pushing for anyway.