“This is strategically relevant because I’m imagining AGI strategies playing out in a world where everything is already going crazy, while other people are imagining AGI strategies playing out in a world that looks kind of like 2018 except that someone is about to get a decisive strategic advantage.” -Christiano
This is a tangent, but I don’t know when else I would comment on this. I think one of the biggest potential effects of an accelerating timeline is that things get really memetically weird and unstable. This point was harder to make before covid, but imagine the incoherence of institutional responses getting much worse than that. In a world where memetic conflict ramps up, your local ability to do sensemaking with your peers gets worse as a side effect: people randomly yelling at you that you need to be paying attention to X. The best thing I know how to do (which may be wholly inadequate) is to deliberately invest in high-trust connections and joint meaning-making. This seems especially likely to be undervalued in a community of high-decouplers. Spending peacetime practicing convergence on low-stakes things, like working through each other’s emotional processing backlogs, rehearses the same moves that become critical in wartime, when lots of people lose touch with any shared consensus reality and polarized competing consensus realities take its place. To tie it back to the portfolio question, I do expect worse instability in peer groups when some people are making huge sums and others are getting wiped out by a high-variance economy.
This is another reason to take /u/trevor1's advice and limit your mass media diet today. If you think propaganda will keep slowly ramping up in effectiveness, you want to avoid boiling the frog by becoming slightly crazier each year. Ideally, try to find peers who prefer not to mindkill themselves either.