Quick feedback since nobody else has commented—I’m all for the AI Safety community appearing “not just a bunch of crazy lunatics, but an actually sensible, open and welcoming community.”
But the spirit behind this post feels like it is just throwing in the towel, and I very much disapprove of that. I think this is why I and others downvoted it too.
Well, I am not arguing for ceasing AGI safety efforts, or that they are unlikely to succeed. I am just claiming that if there is a high enough chance they might be unsuccessful, we might as well make some relatively cheap and simple effort to make that case somewhat more pleasant (although, fair enough, the post might be too direct).
Imagine you had an illness with a 30% chance of death in the next 7 years (I hope you don’t). It would likely affect your behaviour: you would want to spend your time differently, and maybe create some memorable experiences, even though your chance of survival would still be high.
Despite this, it seems surprising that when it comes to AGI-related risks, such tendencies to live life differently are much weaker, even though many people assign similar probabilities. Is that rational?