I am not an AI researcher, but it seems analogous to the acceptance of mortality for most people. Throughout history, almost everyone has had to live with the knowledge that they will inevitably die, perhaps suddenly. Many methods of coping have been utilized, but at the end of the day it seems like something that human psychology is just… equipped to handle. x-risk is much worse than personal mortality, but you know, failure to multiply and all that.
Idk, I’m a doomer and I haven’t been able to handle it well at all. If I were told “You have cancer, you’re expected to live 5-10 more years,” I’d at least have a few comforts:
I’d know that I would be missed, by my family at least, for a few years.
I’d know that, to some extent, my “work would live on” in the form of good deeds I’ve done, people I’ve impacted through effective altruism.
I’d have the comfort of knowing that even if I’d been dead for centuries, I could still “live on” in the sense that other humans (and indeed, many nonhumans) would share brain design with me, and have drives for food, companionship, empathy, curiosity, etc. A super AI, by contrast, is just so alien and cold that I can’t consider it my brainspace cousin.
If I were to share my cancer diagnosis with normies, I would get sympathy. But there are very few “safe spaces” where I can share my fear of UFAI risk without getting looked at funny.
The closest community I’ve found is the environmentalist doomers, and although I don’t actually think the environment is close to collapse, I do find it somewhat cathartic to read other people’s accounts of being sad that the world is going to die.