It is possible that AI will allow for the creation of brain-computer interfaces that let us productively merge with AI systems. I don’t think this concern would apply in that case, since that would be a true “augmentation.”
If that doesn’t happen, though, or before that happens, I think this is a real possibility. The disanalogy is that our brains wouldn’t add anything to sufficiently advanced AI systems, unlike books, which are useless without our brains to read them.
Today, many people are weaker physically than in previous times because we don’t need to do as much physical labor. I don’t see why the same couldn’t happen with our minds. Of course, many people go to the gym, and people will probably also continue to learn things to keep sharp. If that becomes a strong and widespread cultural norm, then we wouldn’t have this problem. But it doesn’t seem guaranteed that would happen.
The disanalogy is that our brains wouldn’t add anything to sufficiently advanced AI systems
Being human is intrinsically valuable. For certain tasks, AI simply cannot replace us. Many people enjoy watching Magnus Carlsen play chess even though a $5 Raspberry Pi computer is better at chess than he is.
Similarly, there are more horses in the USA today than there were at their mid-century low, decades after the Model T and its successors had displaced them from transport and farm work.
Today, many people are weaker physically than in previous times because we don’t need to do as much physical labor.
I haven’t been able to find a definitive source, but I would be willing to bet that a typical “gym bro” is physically stronger than a typical hunter-gatherer due to better diet/training.
Intrinsically valuable to a human-run society. The laws of physics “value” whatever entity can replicate itself as efficiently as possible across all accessible areas of reality (presumably, at a minimum, this galaxy and nearby ones). Such optimized machinery is what Zvi calls “destroying all value in the universe,” because the optimal solution presumably puts all its energy into replication and carries only the minimum of intelligence, inner thought, art, or culture required.
The wake of such an event leaves the universe tiled with repeating subunits that do nothing but stockpile resources, for no purpose other than to exist as long as possible after the heat death of the universe.
No.
1. Things humans value will continue to be valuable, because we are going to solve the alignment problem.
2. Even if AI gained all of the power, it would still find us interesting, for the same reason we find dinosaurs interesting.
3. Even if you are right and the max-entropy state of the universe is a literal paperclipper, it would be far more interesting than you are describing.
1. I hope you are correct, but timelines exist where you are not.
2. Same as (1).
3. I also hope you are correct, but optimality is whatever is optimal.