I apologize to anyone offended, but I stand by my statement. I do believe that the space of possible minds is bigger than any individual mind can conceive.
Your ideas point either to the AI's halting or to a multigenerational AI civilization. Since our concept of identity is human-only, generations of AIs may look more like a single AI. So the question boils down to the continuity of intelligent life.
However, this is not exactly what I wanted to ask. I was more interested in the relation between two potential infinities: the infinite IQ of a very advanced AI, and the infinite future time needed for “immortality”.
It all again boils down to the scholastic question “Could God create a stone so heavy that he couldn’t lift it?”, which is basically a question about infinite capabilities versus infinitely complex problems (https://en.wikipedia.org/wiki/Omnipotence_paradox).
Why do I ask? Because sometimes in discussions I see appeals to a superintelligent AI’s omnipotence (e.g., that it will be able to convert galaxies into quasars almost instantly, or travel at light speed).
What do you mean by infinite IQ? If I take you literally, that’s impossible, because an IQ test outputs a finite real number. But maybe you mean “unbounded optimization power as time goes to infinity,” or something similar.
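To make that distinction concrete (my own sketch, using a hypothetical capability function $C(t)$ that is not defined anywhere in the thread): the coherent reading is not that capability is infinite at any given moment, but that it grows without bound over time,
$$C(t) < \infty \ \text{ for every finite } t, \qquad \lim_{t \to \infty} C(t) = \infty,$$
whereas an IQ score is a single finite real number measured at a fixed time, so “infinite IQ” in the literal sense is never attained.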