Hey! Thanks for sharing the debate with LeCun, I found it very interesting and I’ll do more research on his views.
Thanks for pointing out that even a 1% existential risk is worth worrying about. I think that's true even in my moral system, once I realize that, e.g., a 1% probability that humanity is wiped out = 70 million expected deaths (1% of 7 billion), plus all the expected humans who would never come to be.
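Spelled out as a quick expected-value calculation (just a sketch, using the rough 7 billion population figure above):

\[
\mathbb{E}[\text{deaths}] = p \times N = 0.01 \times 7\times 10^{9} = 7\times 10^{7} \approx 70\ \text{million}
\]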
That's the logical side.
Emotionally, I find it WAY harder to care about a 1% x-risk. Scope insensitivity. I want to think about where else this bias is causing errors in my thinking.