One countervailing thought: I want AGI to be developed in a high-trust, low-scarcity, social-psychological context, because that seems like it matters a lot for safety.
Slow growth enough, and does society as a whole become a lot more bitter and cutthroat?