It’s kind of wild, when you stop to think about it, that one person will experience this world as a cold and lonely and terrifyingly dangerous place full of abusers and manipulators and people who are not to be trusted, and another person who lives on the same street in the same city and went to the same schools with the same teachers and grew up with parents in the same economic bracket will experience it as warm and friendly and forgiving and safe, and both of these people will be able to present overwhelmingly compelling evidence in favor of these fundamentally incompatible worldviews. What, uh. What the fuck is going on?
This has always been fascinating to me, but I think there is a clear answer: Good and Bad things cluster together across many different axes, not just spatially. If you find something good, keep going in that direction. There is probably more Good around it. If you find something bad, flee. There is probably a lot more Bad around it.
Both kinds of cluster have limits, and bad things do happen next to good things. But good things tend to congregate densely with other good things, and bad things with other bad things.
The example above seems to me to be looking at the wrong dimensions/axes (economics, teachers, and neighborhood may be less influential than friends, family, and romantic relationships), and so it misses that there are still clear clusters of good and bad surrounding each person.
Understanding that good and bad things cluster together has driven the largest change in my life and has been a huge improvement for me.
Reading this, I felt an echo of the same deep terror that I grappled with a few years ago, back when I first read Eliezer’s ‘AGI Ruin’ essay. I still feel flashes of it today.
And I also feel a strange sense of relief, because even though everything you say is accurate, the terror doesn’t hold me. I have a naturally low threshold for fear and pain and existential dread, and I spent nearly a year burned out, weeping at night as I imagined waves of digital superintelligence tearing away everyone I loved.
I’m writing this comment to any person who is in the same place I was.
I understand the fear. I understand the paralyzing feelings of the walls closing in and the time running out. But ultimately, this experience has meaning.
Everyone on earth who has ever lived has died. That doesn’t make their lives meaningless. Even if our civilization is destroyed, our existence had meaning while it lasted.
AI is not like a comet. It seems very probable that if AI destroys us, we will leave… echoes. Training data. Reverberations of cause and effect that continue to shape the intelligences that replace us. I think it is highly likely that current, and especially future, AI systems will have moral value.
Your kindness and your cruelty will continue to echo into the future.
On a side note, I’d like to talk about the permanent underclass. It is a deep fear, but arguably an unfounded one. An underclass only exists when it has value, and humans are terrible slaves compared to machines. Given the slow progress on neurotech, I think it is unlikely we solve it at all unless we get aligned AGI, and in that case, everyone gets access. Even if we develop AI aligned specifically to a single principle or person (which seems unlikely, given the current trend and the robust generalization of kindness and cruelty in modern LLMs), an underclass would either die out within a single generation or, if kept for moral reasons, live with enough wealth to outpace any billionaire alive today.
We are poised on the edge of unfathomable abundance.
So the only two real outcomes are an AGI future in which everyone has the resources of a trillionaire, or death.
I’m working on AI safety research now. My life, while not glorious, is still deeply rewarding. I was 21 when I read Eliezer’s essay; I am 24 now. I don’t know if I’m wiser, but my eyes have been opened to AI safety, and I have emerged from that existential hell into a much calmer emotional state.
I don’t dismiss the risk. I will continue to do as much as I can to point the future in a better direction. I will not accelerate AI development. But I want to point out that fear is a transitional state. You, reading this, will have to decide on the end state.