One possible line of argument:
If, as you imply, you find it compelling that being a danger to the rest of humankind is something to avoid, then presumably you should also find it compelling that reducing other such dangers is something to seek out.
And it’s pretty clear that projects that require more than a day’s effort to show results will never be undertaken by a system with no goals larger than day-to-day optimization.
It seems to follow that if there exist dangers to humankind whose measurable reduction requires more than a day's effort, then it's a good idea to have goals larger than day-to-day optimization.