It is very, very clear that at present rates of progress, adding that level of alignment capability as grown over the next N years, to the AGI capability that arrives after N years, results in everybody dying very quickly.
This is a claim I basically agree with.
I don’t think the situation is entirely hopeless, but I don’t think any of the current plans (or the current alignment field) are on track to save us.
Good to hear I’m not the only one with this reaction. Well, “good” isn’t the right word, but y’know. The This Is Fine meme comes to mind. So does the stuff about how having strong feelings is often right and proper.
I’ve always thought this but have never said it to anyone before: I can only imagine the type of stress and anxiety Eliezer deals with. I’m grateful to him for many reasons, but one distinct reason is that he puts up with this presumed stress and anxiety for humanity’s benefit, which includes all of us.
Maybe this would be a good time for the community to apply a concentration of force and help.
Aaaaaaaaaaaaahhhhhhhhhhhhhhhhh!!!!!!!!!!!!
(...I’ll be at the office, thinking about how to make enough progress fast enough.)
Follow-up
One of Eliezer’s claims here is the one quoted at the top: that at present rates of progress, adding the alignment capability grown over the next N years to the AGI capability that arrives after N years results in everybody dying very quickly. As noted above, this is a claim I basically agree with.