“Temptations are bound to come, but woe to anyone through whom they come.” Or, translating from the New Testament into something describing the current situation: accept that AI will come, but don’t be the one who hastens its coming.
Yes, this approach sounds very simple and naive. The people in this email exchange rejected it and went for a more sophisticated one: join the arms race and try to steer it. By now we can see that these ultra-smart, ultra-rich people made things much worse than if they’d followed the “do no evil” approach. If that doesn’t vindicate “do no evil,” I’m not sure what would.
They’ve shortened timelines, yes, but what’s really the counterfactual here?
“Getting AGI 5 years later, with more compute overhang, controlled by people who don’t care about the long-term future of humanity because anyone who did walked away” doesn’t sound like an obviously better world.