Anyone have reading recommendations for fiction or even just a summary description of what a positive future with AI looks like? I’ve been trying to decide what to work on for the rest of my career. I really want to work on genetics, but worry that, like every other field, it’s basically going to become irrelevant since AI will do everything in the future.
I like to think that depictions of a good life after AGI are just called slice-of-life stories. Just find a story about three friends baking a cake and add “also, most of the production of the ingredients was handled by robots.” Any story that doesn’t hinge on someone being poor or in danger is still valid post-scarcity. This eliminates a huge fraction of all the stories we tell, but a much smaller fraction of the stories you’d actually like to have happen to you.
I can’t think of any slice-of-life stories that actually do have the “also, robots” conceit, though. Maybe Questionable Content?
It seems unlikely to me that the things we do post-AGI would remain the same. If you had the lamp from Aladdin and the genie actually worked as described, would your life remain the same? Would you still spend your time baking cakes?
I know that I personally would try to enhance the capabilities of myself and those I care about (assuming they were amenable). To comprehend calculus as Newton did, or numbers as Ramanujan did, would, I think, be an experience far more interesting than baking cakes or taking part in my usual hobbies. And there are thousands of other ways in which I would use my allotment of AI power to change my own experience.
I suspect this would be true for many people, so that self-augmentation via AGI would fundamentally change the experience of being human.
What does such a world look like? I have a very hard time visualizing it. Would power tend to concentrate even more than it does now? How would AI deal reconcile competing human interests?
Good points. I was imagining some successful slow-takeoff scenario where there’s a period of post-scarcity with basically human control of the future (it reminds me of the Greg Egan story “Border Guards”). But late into a slow takeoff, or in a full-on post-singleton world, the full transhumanist package will be realizable.
I’m not so sure that learning to love numbers at the expense of my current hobbies is all that great an idea. Sure, my future self would like it, but right now I don’t love numbers that much. I think a successful future would need some anti-wireheading guardrails that would make it difficult to learn to love math in a way that really eclipsed all your previous interests.
Eh. I think it might fit nicely into the time you might currently spend doing a crossword puzzle or sudoku. Living longer arguably allows ‘doing something just for a little while’ to pay off in a bigger way (where it was previously more constrained by the length of a lifetime).
Also, to some extent there’s ‘integration’: better memory doesn’t necessarily mean you love memorizing things for contests and ‘that’s the only thing you do now’. Maybe instead you just bake without a recipe when you want to make something you’ve made before. Or you remember more sports plays and use them while continuing to play ‘just for fun’.
TL;DR:
If you gained more appreciation for artwork, that wouldn’t necessarily ‘change your entire life’. Instead you might go to an art museum once in a while.
(I also don’t see why you’re afraid of becoming an artist. Oh no, my values might change and I might become Michelangelo! What? Why are you already worried about that? Do you think you are predisposed to getting addicted to that? Why?
Are you a recovering mathematician or something? (I also don’t know what your hobbies are, and why they wouldn’t mix with each other—math problems have to come from somewhere.)
)
Me, no. People who like doing that, yes. That’s not to say it would necessarily last forever, but things (and people) change over time. I also think there’s something different about people who, for example:
1. Buy furniture, versus
2. Go out and make it.
Arguably, people having fewer constraints might mean more of 2.
“How would AI deal reconcile competing human interests?”

What does this mean?
It was a typo. It was meant to say “How would AI reconcile competing human interests?”
Everyone says the Culture novels are the best example of an AI utopia, and even though it’s a cliché to mention the Culture, it’s a cliché for a good reason. Don’t start with Consider Phlebas (the first one), but otherwise just dive in. My other recommendation is the Commonwealth Saga by Peter F. Hamilton and the later Void Trilogy—it’s not on the same level of writing quality as the Culture, although it’s still a great story, but it depicts an arguably superior world to that of the Culture, with more unequivocal support of life extension and transhumanism.
The Commonwealth has effective immortality; a few downsides of it are even noticeable (their culture and politics are a bit more stagnant than we might like), but there’s never any doubt at all that it’s worth it, and it’s barely commented on in the story. The latter-day Void Trilogy Commonwealth is probably the closest a published work of fiction has come to depicting a true eudaemonic utopia that lacks the problems of the Culture.