I wasn’t expecting such a swift response! Unfortunately I’m a little too tipsy to go through the whole paper right now (I’ll get to it with the cold, godless Sunday morn), but I think I’d actually be more interested in the paper you reference at the start, about catastrophism bias. I completely agree that such a bias exists! But still, I don’t think it’s obvious that we’ll develop an AGI that can solve the socio-ecological problems I mentioned earlier before they inhibit the research itself. As such I’m more concerned about the design of a benevolent agrarian revolution before we get to the singularity stuff.
I guess in my mind there’s a big tension here—the societal mechanisms that currently support the development of powerful AGI are also horribly brutal and unjust. Could the research continue in a more just and ecologically sound system?