I’m in the same boat. I’m not that worried about my own life, in the general scheme of things. I fully expect I’ll die, and probably earlier than I would have in a world without AI development. What really cuts me up is the idea that there will be no future to speak of, that all my efforts won’t contribute to something: some small influence on other people enjoying their lives at a later time, a place where people feel happy and safe and fulfilled.
If I had a credible offer to guarantee that future in exchange for my life, I think I’d take it. (I’m currently healthy, with more than half my life left to live, assuming average life expectancy.)
Sometimes I try to take comfort in many-worlds, in the thought that there exist timelines where humanity manages to regulate AI or align it with human values (whatever those are). But given that I have no capacity to influence those timelines, they don’t feel meaningfully real to me.