By unpredictable I mean that nobody really predicted:
(Edit: 1-3 removed to keep a safer distance from object-level politics, especially on AF)
4. Russia and China adopted communism even though they were extremely poor. (Because of this, they were ahead of the US in gender equality and income equality for a time, even though they were much poorer.)
None of these seem well-explained by your “rich society” model. My current model is that social media and a decrease in the perception of external threats relative to internal threats both favor more virtue signaling, which starts spiraling out of control after some threshold is crossed. But the actual virtues that end up being signaled/reinforced (often at the expense of other virtues) are historically contingent and hard to predict.
I could be wrong here, but the stuff you mentioned as counterexamples to my model appears either ephemeral or too particular. The “last few years” of political correctness is hardly enough time to judge world trends by, right? By contrast, the things I mentioned (the end of slavery, explicit policies against racism and war) seem likely to stick with us for decades, if not centuries.
We can explain this after the fact by saying that the Left is being forced to over-correct by impersonal social dynamics (e.g., runaway virtue signaling), but did anyone predict this ahead of time?
When I listen to old recordings of right-wing talk show hosts from decades ago, they seem to be saying the same things people are saying today: about political correctness, about being forced out of academia for saying things the social elite deems harmful, and about the Left being obsessed with equality and identity. So I would definitely say that a lot of people predicted this would happen.
The main difference is that it has now been amplified: recent political events have increased polarization, the people with older values are dying of old age or losing their power, and social media makes us more aware of what is happening. So in hindsight I don’t think this scenario is that surprising.
Russia and China adopted communism even though they were extremely poor
Of course you can point to a few examples where my model fails; I’m talking about general trends rather than specific cases. If we think in terms of world history, Russia in the early 20th century was “rich” in the sense that it was much richer than countries in previous centuries, and this is what enabled it to implement communism in the first place. Government power waxes and wanes, but I think it has clearly grown over time as the world has gotten richer, and I think this could have been predicted.
When I listen to old recordings of right-wing talk show hosts from decades ago, they seem to be saying the same things people are saying today: about political correctness, about being forced out of academia for saying things the social elite deems harmful, and about the Left being obsessed with equality and identity. So I would definitely say that a lot of people predicted this would happen.
I think what’s surprising is that although academia has been left-leaning for decades, the situation had been relatively stable until the last few years, when things suddenly progressed very quickly, to the extent that even professors who firmly belong on the Left are being silenced or driven out of academia for disagreeing with an ever-changing party line. (It used to be that universities at least paid lip service to open inquiry, overt political correctness was confined to non-STEM fields, and there was relatively open discussion among people who managed to get into academia in the first place. At least that’s my impression.) Here are a couple of links for you if you haven’t been following the latest developments:
Bret Weinstein’s talk, How the Magic Trick is Done
https://cliffmass.blogspot.com/2019/10/the-university-of-washington-should-not.html
A quote from the second link:
Afterward, several faculty who had attended the gathering told me they were afraid to speak in my defense. One, a full professor and past chair, told me that what had happened was very wrong but he was scared to talk.
Another faculty member, who was originally from China and lived through the Cultural Revolution, told me it was exactly like the shaming sessions of Maoist China, with young Red Guards criticizing and shaming elders they wanted to embarrass and remove.
(BTW, I came across this without specifically searching for “cultural revolution”.) Note that the author favors carbon taxes in general and supported past attempts to pass them, but was punished for criticizing a specific proposal he took issue with. How many people (if any) predicted that things like this would be happening on a regular basis by now?
I could be wrong here, but the stuff you mentioned appears either ephemeral or too particular. The “last few years” of political correctness is hardly enough time to judge world trends by, right? By contrast, the things I mentioned (the end of slavery, explicit policies against racism and war) seem likely to stick with us for decades, if not centuries.
It sounds like you think that something like another Communist Revolution or Cultural Revolution could happen (one that emphasizes some random virtues at the expense of others), but that the effect would be temporary, and that once it’s over, longer-term trends will reassert themselves. Does that seem fair?
In the context of AI strategy, though (specifically something like the Long Reflection), I would worry that a world in the grip of another Cultural Revolution would be strongly tempted to abandon (or simply unable to refrain from abandoning) the plan to delay AGI, and would instead build a superintelligent AI and lock in their values ASAP, even if that involves more safety risk. The predictability of longer-term moral trends (even if real) doesn’t seem to help with this concern.
It sounds like you think that something like another Communist Revolution or Cultural Revolution could happen (one that emphasizes some random virtues at the expense of others), but that the effect would be temporary, and that once it’s over, longer-term trends will reassert themselves. Does that seem fair?
That’s pretty fair.
I think it’s likely that another cultural revolution will happen, and it could adversely affect the future if it coincides with the transition to an AI-based economy. However, the deviations from long-term trends are very hard to predict, as you point out, and we should learn more about the specifics as we get further along. In the absence of concrete details, I find it far more helpful to use information from long-term trends than to worry about specific scenarios.
I think it’s likely that another cultural revolution will happen, and it could adversely affect the future if it coincides with the transition to an AI-based economy.
This seems to be ignoring the part of my comment at the top of this sub-thread, where I said “[...] has also made me more pessimistic about non-AGI or delayed-AGI approaches to a positive long term future (e.g., the Long Reflection).” In other words, I’m envisioning a long period of time in which humanity has the technical ability to create an AGI but is deliberately holding off to better figure out our values or otherwise perfect safety/alignment. I’m worried about something like the Cultural Revolution happening in this period, and you don’t seem to be engaging with that concern?
Ahh. To be honest, I read that, but then responded to something different. I assumed you were just expressing general pessimism, since there’s no guarantee that we would converge on good values upon a long reflection (and you recently viscerally realized that values are very arbitrary).
Now I see that your worry is narrower: a cultural revolution might happen during this period, and the world might act unwisely and create the AGI in its wake. I guess this is quite plausible and an important concern, though I personally am skeptical that anything like the long reflection will ever happen.
Ahh. To be honest, I read that, but then responded to something different. I assumed you were just expressing general pessimism, since there’s no guarantee that we would converge on good values upon a long reflection (and you recently viscerally realized that values are very arbitrary).
I guess I was also expressing a more general update toward pessimism: even if nothing happens during the Long Reflection that causes it to prematurely build an AGI, other new technologies available or deployed during the Long Reflection could invalidate the historical tendency for “Cultural Revolutions” to dissipate over time and for moral evolution to continue along longer-term trends.
though I personally am skeptical that anything like the long reflection will ever happen.
Sure, I’m skeptical of that too, but given my pessimism about more direct routes to building an aligned AGI, I thought it might be worth pushing for it anyway.