I don’t remember who said it, but building AI isn’t just about power dynamics or a marginal gain in efficiency.
It’s about whether humanity should keep doing things.
Civilization feels like it has stagnated and degraded over the last few decades (with the main technological upgrade of the era being a cause of social degradation).
We haven’t solved cancer, we can’t regrow limbs, people are unhealthy, commuting to work is unpleasant, and work weeks are long. The list goes on.
Humans make tools to let themselves do better and more work. Humans even set up full automation of certain things. Now humans are looking to fully automate humans, perhaps because we don’t believe in the human race. (I think EY and doomers generally are the same as the accelerationists in this respect: neither has faith in humanity.)
What could humans make that would restore faith: faith that we could compete with AGIs, faith that we can get out of stagnation without replacing humans, faith that we can make the world of humans a better one?
A tech advance, an organizational efficiency advance, quality of life, something else?
When you say ~zero value, do you mean hyperbolically discounted or something more extreme?