If someone has some money, they can invest it to get more money. Do you know what the difference is between money and intelligence that makes it plausible to expect an abrupt intelligence explosion, but reasonable to expect steady exponential growth for financial investment returns?
One answer is that using your intelligence to improve your own cognitive architecture is an entirely new field of investment. The economic growth that accrues from modern investing looks steady from inside the modern economy, but it’s explosive from the perspective of a premodern society.
One distinction that may matter here is that something only counts as money if it is actually traded between distinct agents. If I got 10,000 times better at making houses from scratch and built a house for myself but no one else, there would be no increase in the local economy. If I started selling houses, this would be different.
A superintelligence can be thought of as a large island with a lot of internal trade that is invisible to the outside economy. If that trade were counted as value, the economy would also be exploding. This opacity of internal trade, characteristic of intelligence and other intra-agent changes, seems to be responsible for the distinctive growth extrapolations.
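The house example can be made concrete with a toy sketch. Here "measured GDP" only sums transactions *between* agents, so production consumed internally never shows up, however productive the producer becomes (all names and numbers below are made up for illustration):

```python
# Toy sketch (made-up numbers): measured output only counts transactions
# *between* agents, so a 10,000x productivity gain that is consumed
# internally is invisible to the measured economy.

transactions = []   # (seller, buyer, price) -- the only thing GDP sees

def sell_house(seller, buyer, price):
    transactions.append((seller, buyer, price))

def build_house_for_self(builder):
    pass  # real value created, but no transaction is recorded

def measured_gdp():
    return sum(price for _, _, price in transactions)

build_house_for_self("me")               # huge productivity, zero GDP impact
assert measured_gdp() == 0.0

sell_house("me", "neighbor", 300_000.0)  # the same skill, now traded
assert measured_gdp() == 300_000.0
```

The same bookkeeping asymmetry applies to a superintelligence's "internal trade": the value is real, but nothing about the measurement process registers it.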
While not exactly investment, consider the case of an AI competing with a human to devise a progressively better high-frequency trading strategy. An AI would probably:
be able to bear more things in mind at one time than the human
evaluate outcomes faster than the human
be able to iterate on its strategies faster than the human
I expect the AI’s superior capacity to “drink from the fire hose” together with its faster response time to yield a higher exponent for the growth function than that resulting from the human’s iterative improvement.
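The claim about a higher growth exponent can be sketched as a toy model. All the numbers here are made up: both agents get the same multiplicative gain per iteration, but the AI's faster iteration rate enters the exponent, so the gap itself grows exponentially over time:

```python
# Toy model (illustrative only, with made-up numbers): two agents improve a
# trading strategy by iterating.  Each iteration multiplies strategy quality
# by the same factor, but the AI completes iterations faster, so over a fixed
# wall-clock period its effective growth exponent is higher.

def quality(iterations_per_day: float, gain_per_iteration: float, days: float) -> float:
    """Strategy quality after `days`, starting from quality 1.0."""
    return gain_per_iteration ** (iterations_per_day * days)

human = quality(iterations_per_day=1.0, gain_per_iteration=1.01, days=365)
ai = quality(iterations_per_day=10.0, gain_per_iteration=1.01, days=365)

# Same per-iteration gain; the 10x iteration rate shows up in the exponent,
# so the ratio between the two grows exponentially with time.
print(f"human: {human:.2f}, ai: {ai:.2e}, ratio: {ai / human:.2e}")
```

Nothing here depends on the specific constants; the point is only that iteration *rate* multiplies the exponent rather than the base.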
A more realistic example would be “competing with a human teamed up with a narrow AI”.
You’re right, that is more realistic. Even so, I get the feeling that the human would have less and less to do as time goes on. I quote:

“He just loaded up on value stocks,” says Mr. Fleiss, referring to the AI program. The fund gained 41% in 2009, more than doubling the Dow’s 19% gain.
As another data point, a recent chess contest between a chess grandmaster (Daniel Naroditsky) working together with an older AI (Rybka, rated ~3050) and the current best chess AI (Stockfish 5, rated 3290) ended with a 3.5–0.5 win for Stockfish.
I don’t think an article which compares a hedge fund’s returns to the Dow (a price-weighted index of about 30 stocks!) can be considered very credible. And there are fewer quant funds, managing less money, than there were seven years ago.
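Why price weighting matters can be shown with a hypothetical three-stock market (all prices and share counts below are made up, and the Dow's actual divisor is simplified to the number of stocks): a price move in a high-priced but tiny company swings a price-weighted index far more than a cap-weighted one.

```python
# Hypothetical three-stock market, illustrating why a price-weighted index
# like the Dow can move very differently from a cap-weighted one.
stocks = [
    # (name, price, shares outstanding) -- made-up numbers
    ("A", 100.0, 1_000),       # tiny company, high share price
    ("B", 10.0, 1_000_000),    # large company, low share price
    ("C", 50.0, 10_000),
]

def price_weighted(prices):
    """Dow-style: average of share prices (divisor simplified to N here)."""
    return sum(prices) / len(prices)

def cap_weighted(prices, shares):
    """S&P-style: total market capitalization."""
    return sum(p * s for p, s in zip(prices, shares))

prices = [p for _, p, _ in stocks]
shares = [s for _, _, s in stocks]

# Double the price of A (the high-priced but tiny company):
prices2 = [200.0, 10.0, 50.0]
print("price-weighted move:", price_weighted(prices2) / price_weighted(prices))
print("cap-weighted move:  ", cap_weighted(prices2, shares) / cap_weighted(prices, shares))
```

With these numbers the price-weighted index jumps by over 60% while the cap-weighted one barely moves, which is why the Dow is a weak benchmark for a fund's stock-picking skill.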
One difference is hardware overhang. When new AI technologies are created, there is often already many times the amount of hardware available that they need to run.
However, if an international agreement were reached in advance, and some of the AGI sandboxing problems were solved, we might be able to restrict AGI technology to a bounded amount of hardware for a time, if theoretical results which we do not yet have showed that this was the appropriate course of action.
We have all kinds of work to do.
There are some qualitative shifts in growth rate with investment. For example, very small amounts of money basically cannot be invested in the financial markets due to the high fixed costs associated with money-laundering regulation, etc.
On the other hand, the more income people have, the more they consume, which reduces growth. Levels of consumption deemed adequate by our ancestors, and indeed consistent with high savings rates, are not viewed as acceptable any more.
The market is more or less stabilized; there are powers and superpowers in some balance. (Gaining money can sometimes be an illusion, like betting, and winning, more and more in a casino.)
If you are thinking about making money, you have to count the sum of all money in society: whether investment means a bigger sum of value, or just an exchange in economic wars, or just inflation. (If foxes invest more in hunting and eat more rabbits, there could be more foxes, right? :)
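The point about counting the whole money supply can be sketched as toy bookkeeping (made-up names and amounts): a trade moves money between agents without changing the total, while newly printed money raises the nominal total without any value being created.

```python
# Toy bookkeeping sketch (made-up numbers): a trade moves money between
# agents without changing the total, while "printing" raises the nominal
# total without creating value.

balances = {"alice": 100.0, "bob": 100.0}

def trade(payer, payee, amount):
    """A pure transfer: the money supply is unchanged."""
    balances[payer] -= amount
    balances[payee] += amount

def print_money(recipient, amount):
    """New nominal money enters the system; the total goes up."""
    balances[recipient] += amount

total_before = sum(balances.values())
trade("alice", "bob", 30.0)
assert sum(balances.values()) == total_before        # transfers are zero-sum

print_money("bob", 50.0)
assert sum(balances.values()) == total_before + 50.0  # inflation, not value
```

So whether "investment returns" reflect real growth depends on which of these two mechanisms dominates, which is exactly the distinction being raised.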
In the AI sector there is a much higher probability of a phase transition (i.e., an explosion). I think that’s the difference.
How?
Possibility: There is probably already enough hardware, and we are just waiting for the spark of a new algorithm.
Possibility: If we count the agricultural revolution as an explosion, we could also count a massive change in productivity from AI (which is probably obvious).