I like it. Thanks for sharing.

(spoilers below)

While I recognize that the story assumes alignment succeeds, I'm curious about a couple of worldbuilding points.
First, about this stage of AI development:
His work becomes less stressful too — after AIs surpass his coding abilities, he spends most of his time talking to users, trying to understand what problems they’re trying to solve.
The AIs in the story are really good at understanding humans. How does he retain this job when it seems like AIs would do it better? Are AIs just prevented from taking over society from humans through a combination of alignment and some legal enforcement?
Second, by the end of the story, it seems like AIs are out of the picture entirely, except perhaps as human-like members of the hivemind. What happened to them?
In other words: I’d like to know what kind of alignment or legal framework you think could get us to this kind of utopia.
EDIT: I found this tweet from someone who says they just interviewed Richard Ngo. The full interview isn't out yet, but the tweet says that when asked about ways in which his stories seem unrealistic, Richard Ngo:
wasn’t attached to them, nor did he say “these ideas are going to happen” or “these ideas should make you feel like AGI risk isn’t a big deal.” He juggles with ideas with a light touch, which was cool.
So it would seem that my questions don’t have answers. Fair enough, I suppose.