Eliezer, you wrote:

“Or else what would we do with the future? What would we do with the billion galaxies in the night sky? Fill them with maximally efficient replicators? Should our descendants deliberately obsess about maximizing their inclusive genetic fitness, regarding all else only as a means to that end?”
Won’t those of our descendants who have genes or code that cause them to maximize their genetic fitness come to dominate the billions of galaxies? How can there be any other stable long-term equilibrium in a universe in which many lifeforms have the ability to choose their own utility functions?
Genetic fitness refers to the reproduction of individuals. The future will not have a firm concept of individuals. What is relevant is control of resources, and this is independent of reproduction.

Furthermore, what we think of today as individuality will correspond to information in the future. Reproduction will correspond to high mutual information. And high mutual information between your algorithms leads to inefficient use of resources. Therefore evolution and competition will, at least in this way, work against the future correlate of “genetic fitness”.
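To make the mutual-information claim concrete (a toy illustration of my own, not something from the comment above): if “reproduction” means producing a near-copy, then parent and offspring share almost all their bits, so I(X;Y) between them is high, and storing both is largely redundant. A minimal sketch:

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Mutual information I(X;Y) in bits between two paired sequences,
    estimated from empirical frequencies."""
    n = len(xs)
    px = Counter(xs)
    py = Counter(ys)
    pxy = Counter(zip(xs, ys))
    mi = 0.0
    for (x, y), count in pxy.items():
        p_joint = count / n
        p_indep = (px[x] / n) * (py[y] / n)
        mi += p_joint * math.log2(p_joint / p_indep)
    return mi

# A "parent" bit-string and a perfect "offspring" copy:
parent = [0, 1, 0, 1] * 25
clone = list(parent)          # reproduction: maximal shared information
unrelated = [0] * 50 + [1] * 50  # same bit frequencies, no correlation

mutual_information(parent, clone)      # → 1.0 bit (all of H(X) is shared)
mutual_information(parent, unrelated)  # → 0.0 bits
```

On this toy reading, a population of clones pays the storage and computation cost of many copies while contributing only one copy's worth of new information, which is the claimed inefficiency.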
Wow, too big an inferential distance, Phil. No idea what you are talking about here: “what we think of today as individuality will correspond to information in the future.”
Would you mind giving a few more details? Curiosity striking...
I’m very wary of this post for being so vague and not linking to an argument, but I’ll throw my two cents in. :)
The future will not have a firm concept of individuals.
I see two ways to interpret this:
You could see it as individuals being uploaded to some giant distributed AI—individual human minds coalescing into one big super-intelligence, or being replaced by one; or
Having so many individuals that the entire idea of worrying about 1 person, when you have 100 billion people per planet per quadrant or whatever, becomes laughable.
The common thread is that “individuality” is slowly being supplanted by “information”—specifically that you, as an individual, only become so because of your unique inflows of information slowly carving out pathways in your mind, like how water randomly carves canyons over millions of years. In a giant AI, all the varying bits that make up one human from another would get crosslinked, in some immense database that would make Jorge Luis Borges blush; meanwhile, in a civilization of huge, huge populations, the value of those varying bits simply goes down, because it becomes increasingly unlikely that you’ll actually be unique enough to matter on an individual level. So, the next bottleneck in the spread of civilization becomes resources.
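The “unlikely to be unique enough to matter” point can be put in birthday-problem terms (a toy model of my own, not the commenter’s): if each individual is summarized by some number of distinguishing bits, the probability that everyone in a population of N is distinct collapses once N approaches the square root of the number of possible patterns. A sketch:

```python
import math

def p_all_unique(n, bits):
    """Birthday-problem approximation: probability that n individuals,
    each summarized by `bits` random identity bits, are all distinct."""
    space = 2.0 ** bits
    return math.exp(-n * (n - 1) / (2.0 * space))

# 100 billion people per planet, 64 bits of distinguishing traits:
p_all_unique(100_000_000_000, 64)   # effectively 0: duplicates everywhere
# ...whereas 100 bits restores near-certain uniqueness:
p_all_unique(100_000_000_000, 100)  # ~0.999999996
```

Note the flip side: the number of bits needed for uniqueness grows only logarithmically in the population, which is one sense in which any single individual’s varying bits carry little marginal value.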
This is probably my first comment on this site—feel free to browbeat me if I didn’t get my point across well enough.
I’ve been lurking for a while, and this is my first post, but:
FTFY. Instead of asking for a single detailed story, we should ask for many simple alternative stories, no?
Obviously, this doesn’t countermand your complaint about inferential distance, which I totally agree with.
Still waiting for OP to deliver...
It’s probably just something stupid, like he thinks humans will upload onto computers and he thinks he knows how future society-analogues will function.
This /seems/ to contain great insight that I can’t comprehend yet. Yes, please, how do I learn to see what you see?