the fervor in this thread seemed to me completely unjustified. [...] My main problem is how far I should go to neglect other problems in favor of some high-impact low-probability event.
I agree with SIAI’s goals. I don’t see it as “fervor”. I see it as: I can do something to make this world a better place (according to my own understanding, better than any other option available to me), therefore I will do so.
I compartmentalize. Humans are self-contradictory in many ways. I can send my entire bank account to some charity in the hopes of increasing the odds of friendly AI, and I can buy a hundred-dollar bottle of bourbon for my own personal enjoyment. Sometimes on the same day. I’m not ultra-rational or a pure utilitarian. I’m a regular person with various drives and desires. I save frogs from my stairwell rather than driving straight to work and earning more money. I do what I can.