I’m not sure that I see a strong demand-side argument here against exponential growth.
Even if demand for one specific technology falls (for example, people care less about having more hard drive space), you would expect demand for other technological advancements to increase; after all, demand is basically limited by consumer wealth, and in the long run, if consumers are spending less money on one thing, they’ll be spending more money on something else, which will encourage technological improvement in that field.
Overall, I think the exponential growth story at a societal level is larger than the computers/Moore’s-law type of exponential growth you’re focusing on here. What I would say is that exponential growth comes from something like this:
Assumption 1. As technology improves and science advances, and as existing technologies become more widely deployed and cheaper and more uses are found for them, the total resources available to society as a whole increase.
Assumption 2. Society will continue to invest a certain percentage of its resources into advances in science and technology.
It’s basically the ‘compound interest’ model of exponential growth: the more investment capital we have as a species, the more investment payoff we get, which then gives us more investment capital.
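A minimal sketch of that compound-interest model, assuming (purely for illustration) that society reinvests a fixed fraction of its resources each year and that each unit invested adds a fixed multiple to next year’s resource base; the fraction and payoff here are made-up numbers, not estimates.

```python
# Toy 'compound interest' model of societal growth.
# Assumptions (purely illustrative): a fixed fraction of total resources is
# reinvested in science and technology each year, and each unit invested adds
# a fixed multiple to next year's resource base.

def project_resources(initial_resources=100.0,
                      reinvestment_fraction=0.05,    # hypothetical share reinvested
                      payoff_per_unit_invested=0.6,  # hypothetical return per unit invested
                      years=50):
    resources = initial_resources
    trajectory = [resources]
    for _ in range(years):
        investment = reinvestment_fraction * resources
        resources += payoff_per_unit_invested * investment  # gains are reinvested next year
        trajectory.append(resources)
    return trajectory

path = project_resources()
print(f"Year 0: {path[0]:.0f}  Year 25: {path[25]:.0f}  Year 50: {path[50]:.0f}")
```

With these numbers the effective growth rate is a constant 3% a year, which is exactly what makes the trajectory exponential rather than linear: the payoff is proportional to the current resource base, so the gains compound.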
“Resources” is a vague term, but the assumption seems true to me in a number of senses. Better technology means better scientific instruments, which means faster advances in science. More advances in science means more options opening up for technological progress. More technology also means a larger economy, with more industrial production, more efficient food production, more overall resources. More resources can be devoted to education, which increases the mental resources of the human race. More access to information, etc.
Basically, even if demand for consumer computer technology slows down, I would just expect that capital to go into other kinds of research (industrial computer technology, or biotech, or energy research, or pure science, etc.), which would then increase the total capacity of the human race and accelerate overall research. And then, even if the demand for improved computer technology doesn’t justify spending X amount of money right now, that same research will be both easier and cheaper to do, and require a much smaller relative share of resources, in the future. So even if computer technology slows down in the short term, I would expect overall exponential growth to continue to accelerate until it drags computer technology along with it.
I think that’s a common phenomenon; right now, we as a society are investing a huge amount of our resources into computer technology, and as a side effect that’s dragging along technological improvement in dozens of other areas, everything from biology to automobiles. If we shift our focus, overall advancement should continue, and would likely drag along technological advancement in everything, including computers. (Perhaps biological research or materials science research would suggest better ways of constructing computers, for example.)
I do think economic growth will continue to be exponential over short time horizons, though the exponent itself might change over time (it’s unclear whether the change will be in the positive or negative direction). My focus here was on specific technologies whose continued exponential growth for the next 30 years or so is used as an argument for the imminence of a technological singularity.
As we shift from one paradigm of advancement to another, we may still have exponential growth, but the exponent for the new exponential growth paradigm may be quite different.
Fair enough. In that case, though, I think you have to consider the possibility that other forms of technological development might themselves lead to a singularity of a different type (biotech, for example, seems quite possible), or might at least lower the barrier and make it easier for someone to improve computer technology with fewer resources, making it profitable for people to continue to improve computers even with a lower payoff in terms of consumer demand.
That is: if there’s only X amount of consumer demand for “better computers”, by whatever definition you want to use, that might not be enough to fund the research needed to accomplish it right now; but in an exponentially growing economy with exponentially growing technology and resources, it should cost far less, relative to total resources, to make that advance in a few years.
So long as the whole economy and the whole mass of human science and technology continues to grow exponentially, I would expect computers to continue to improve exponentially; they may become a “lagging indicator” of progress instead of the cutting edge if other areas get a larger fraction of the research capital investment, but even that should be enough to maintain some kind of exponential curve.
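A small illustration of that relative-cost point, under the assumption (made up for this sketch) that total resources grow about 7% a year while the absolute cost of some fixed, one-off advance stays constant:

```python
# How a fixed-cost advance shrinks relative to an exponentially growing economy.
# The growth rate and the cost/economy figures are illustrative assumptions.

GROWTH_RATE = 0.07        # assumed annual growth of total resources
FIXED_COST = 50.0         # cost of the hypothetical advance, constant in absolute terms
INITIAL_ECONOMY = 1000.0  # size of the economy today (same arbitrary units)

for years in (0, 5, 10, 20):
    economy = INITIAL_ECONOMY * (1 + GROWTH_RATE) ** years
    print(f"After {years:2d} years the advance costs {FIXED_COST / economy:.1%} of total resources")
```

The absolute cost hasn’t changed, but the share of total resources it demands roughly halves every ten years, which is the sense in which the same advance becomes easier to fund later.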
Yes, this is a plausible scenario. I personally put weight on this type of scenario, namely, that progress might stall and then resume once some complementary supply-side and demand-side innovations have been made and other economic progress has happened to support more investment in the area. I don’t think this would be runaway technological progress. I might talk more about this sort of scenario in a future post.
No reason to think it won’t be runaway technological progress, depending on how you define runaway. The industrial revolution was runaway technological progress. Going from an economic-output doubling time of 1,000 years to 15 years is certainly runaway. The rate of growth ultimately stalled, but it was certainly runaway for that transitional period, even though there were stalls along the way.
Edited to add link: if you haven’t already seen a version of this talk by Robin Hanson, the first 20 minutes or so go into this, but it’s interesting throughout if you have time. http://www.youtube.com/watch?v=uZ4Qx42WQHo
No reason? How about humans?
Um. Runaway progress does not stall, by definition—think about what “runaway” means.
So we’re talking about a human-based runaway scenario? That’s not gonna happen.
OK, that’s what ‘runaway’ growth means. Can this even be predicted? I think not. How could you possibly ever know that you’re in a runaway? The transition from agriculture to industry made economic growth roughly 65 times faster. I think that if we saw global output growth accelerate by even half that factor over the next 20 years, most people would call it a runaway scenario.
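As a sanity check, the “roughly 65 times faster” figure follows directly from the doubling times quoted above (about 1,000 years pre-industrial versus about 15 years after), since the growth rate implied by a doubling time T is ln 2 / T:

```python
import math

# Speed-up factor between two eras = ratio of the implied growth rates,
# which reduces to the ratio of the doubling times.
pre_industrial_doubling_years = 1000  # figure used earlier in the thread
industrial_doubling_years = 15

speedup = (math.log(2) / industrial_doubling_years) / (math.log(2) / pre_industrial_doubling_years)
print(f"Roughly {speedup:.0f}x faster growth")  # ~67x, i.e. 'roughly 65 times faster'
```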
We are talking about a runaway scenario in a human civilization, aren’t we?
So what does it mean?
I don’t think that’s possible. Do you? A runaway means a massive and ongoing boost in productivity. That seems achievable only by AI, full brain emulations, or transhumans that are much smarter and faster at doing stuff than humans can be.
I was agreeing (mostly). My point was that, by that definition, we could never predict, or even know that we are in the middle of, a runaway scenario. I did pose it as a question and you did not reply with an answer. So what do you think? If the doubling time of economic output decreased by a factor of 35 over the next two, or even four, decades, would you think we were in a runaway scenario?
The situation in quote 1 will not happen within the time frame in quote 2.
Generally speaking, I understand “runaway” as “unstoppable”, meaning both that it won’t stop on its own (stall) and that we lost control over it.
And, by the way, understanding that you lost control is how you know you’re in a runaway scenario.
I did not mean to imply that the situation in quote 1 would happen within the time frame of quote 2, and I don’t think I did. It’s a thought experiment, and I think that is clear.
There are examples of this in real history from smart people who thought we’d lost control—see Samuel Butler. We have, arguably. The extent to which machines are now integral to continued economic prosperity is irreversible without unbearable costs (people will die).
I am confused. What is a thought experiment?
My impression is that you are now evading questions and being deliberately provocative; but I’ll play...
If the rate of economic growth were to increase by 35x, would you think you were in a runaway scenario?
http://en.wikipedia.org/wiki/Thought_experiment
When I’m being deliberately provocative, it’s… more noticeable :-D I also know what a thought experiment is.
What I was confused about is exactly which part of the whole discussion about exponential growth you considered to be a thought experiment.
If that were the only piece of information that I had, no, I would not think so. Insufficient data.
I personally put weight on this type of scenario, namely, that progress might stall and then resume once some complementary supply-side and demand-side innovations have been made and other economic progress has happened to support more investment in the area.
Yeah, so do I.
I’m not sure it makes a lot of difference in terms of long-run predictions, though. Let’s say that for the next 10 years, we cut the amount of research we are doing into computers in half in percentage terms (so instead of putting X% of our global GDP into computer research every year, we put in X/2%). Let’s say we take that and instead invest it in other forms of growth (other technologies, biotech, transhuman technologies, science, infrastructure, or even bringing the third world out of poverty and into education, etc.) and maintain the current rate of global growth. Let’s further say that the combined rate of global GDP growth and science-and-technology growth is roughly 7% a year, so that the amount the global economy can devote to research doubles every 10 years. And then at the end of that period, computer research goes back up to X%.
In that case, that 10-year research slowdown would mean that the point we “should” have reached in computer science in 2044 would instead be reached in 2045; if that’s the point we need to be at to get a singularity started, then the 10-year research slowdown would only delay the singularity by about 1.75 years. (edit: math error corrected)
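For what it’s worth, here is one way to check that kind of catch-up arithmetic under the stated assumptions (7% annual growth in research capacity, computer research at half its usual funding share for 10 years). The exact figure depends on the horizon you pick and on how you count the deficit; this sketch treats the 2044 target as roughly 30 years out, which is an assumption of mine.

```python
# Rough catch-up arithmetic for the thought experiment above. All parameters
# are the stated assumptions (7% annual growth in research capacity, computer
# research at half its usual funding share for 10 years), plus the assumption
# that the 2044 target is about 30 years out.

GROWTH = 0.07        # annual growth of total research capacity (~doubles every 10 years)
SLOWDOWN_YEARS = 10  # years during which computer research gets half its usual share
HORIZON = 30         # years until the cumulative-progress target would be reached anyway
STEP = 0.01          # integration step, in years

def cumulative_progress(until, half_funded_years=0.0):
    """Integrate research effort over time, halving the funding share early on."""
    total, t = 0.0, 0.0
    while t < until:
        share = 0.5 if t < half_funded_years else 1.0
        total += share * (1 + GROWTH) ** t * STEP
        t += STEP
    return total

target = cumulative_progress(HORIZON)                    # baseline progress at the horizon
t = HORIZON
while cumulative_progress(t, SLOWDOWN_YEARS) < target:   # extra time needed with the slowdown
    t += STEP
print(f"Delay: about {t - HORIZON:.1f} years")           # ~0.9 years with these choices
```

Different accounting choices shift the exact number, but the qualitative point stands: against exponentially compounding research capacity, a decade at half funding only costs a year or two, not a decade.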
And not only that: after a 10-year slowdown in computer science research, I would expect computers to become the new “low-hanging fruit”, and we might end up devoting even more resources to them at that point, perhaps eliminating the time loss altogether.
Basically, so long as exponential growth continues at all, in technological and economic terms in general, I don’t think the kind of slowdown we’re talking about would have a huge long-term effect on the general trajectory of progress.
As a general observation, you don’t want to model growth (of any sort) as X% per year. You want to model it as a random variable with a mean of X% per year, perhaps, and you want to spend some time thinking about its distribution. In particular, whether that distribution is symmetric and how far out the tails go.
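A toy illustration of that point, with made-up numbers: two growth processes with the same 7% arithmetic-mean annual growth, one with symmetric year-to-year noise and one where a large drawdown occasionally replaces a normal year, end up with noticeably different outcome distributions.

```python
import random

# Two toy growth processes with the same average (arithmetic-mean) annual growth
# of ~7%, one with symmetric noise and one with rare large drawdowns.
# All numbers are illustrative assumptions, not estimates of anything.

def simulate(years=50, runs=10_000, shock_prob=0.0, shock=-0.30, seed=0):
    rng = random.Random(seed)
    outcomes = []
    normal_mean = (0.07 - shock_prob * shock) / (1 - shock_prob)  # keeps the mean at ~7%
    for _ in range(runs):
        level = 1.0
        for _ in range(years):
            growth = shock if rng.random() < shock_prob else rng.gauss(normal_mean, 0.02)
            level *= 1 + growth
        outcomes.append(level)
    outcomes.sort()
    return outcomes

for label, p in (("symmetric", 0.0), ("fat-tailed", 0.05)):
    out = simulate(shock_prob=p)
    print(f"{label:>10}: median {out[len(out) // 2]:5.1f}x, worst decile {out[len(out) // 10]:5.1f}x")
```

Same mean, different shape: the fat-tailed process compounds to a lower median and a much worse bottom decile, which is why the shape of the distribution and its tails matter, not just the X%.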
As we shift from one paradigm of advancement to another, we may still have exponential growth, but the exponent for the new exponential growth paradigm may be quite different.
Won’t the rate of economic growth be different (much larger) by definition? I can’t envisage a scenario where economic growth is roughly as it is now, or slower, yet we have experienced anything even approaching a technological singularity. Think of the change in growth rates resulting from the farming and industrial revolutions.
Something to puzzle over is the fact we have seen computational grunt grow exponentially for decade upon decade yet economic growth has been stable over the same period.
Depends on the reason for the switch to a new paradigm. If the reason is that there are even more attractive options, then economic growth would accelerate. If the reasons are that we’re running out of demand for improvement across the board, and people are more satisfied with their lives, and the technological low-hanging fruit are taken, then economic growth could be lower.
I see. The demand-side story. I suppose it is technically feasible, but I find it unlikely in the extreme. There is nothing in history to suggest it, and I don’t think it fits with psychology. History is full of claims that we won’t want for anything once we have ‘some foreseen progress’. We’ve had the luxury of being able to trade some economic growth for more leisure, and still be better off than our grandparents, for a long time now, but we haven’t taken it.
If the reasons are that we’re running out of demand for improvement across the board, and people are more satisfied with their lives, and the technological low-hanging fruit are taken, then economic growth could be lower.
Do you mean lower than it is now? After a paradigm shift in advancement?
Something to puzzle over is the fact we have seen computational grunt grow exponentially for decade upon decade yet economic growth has been stable over the same period.
Economic growth itself is an exponential function. “The economy grows 3% every year” is exponential growth, not linear growth. I would say that it’s only happened because of exponential technological progress; we never had that level of exponential growth until the industrial revolution. And I would say that most of the economic growth the first world has had over the past 20 years has come from recent technological advancement, mostly the twin communication and computing revolutions (PCs, cell phones, the internet, smartphones, and some smaller examples of both).
I’m not sure how you got from my comments that I don’t understand exponential growth. But let me remake the point more clearly. The doubling time of economic output has remained stable at around 15 years. The doubling time of computational processing speed has remained roughly stable at around 24 months. I agree that economic growth in developed economies over the last 20 years has come largely from tech progress. But that has not changed the rate of economic growth.
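For concreteness, a doubling time of T years corresponds to an annual growth rate of 2^(1/T) − 1, so the two doubling times quoted here translate as follows:

```python
# Convert doubling times into equivalent annual growth rates, using the figures
# quoted in this thread (~15 years for economic output, ~24 months for processing speed).

def annual_rate(doubling_time_years):
    return 2 ** (1 / doubling_time_years) - 1

for label, years in (("economic output, 15-year doubling", 15),
                     ("processing speed, 24-month doubling", 2)):
    print(f"{label}: ~{annual_rate(years):.1%} per year")
```

In other words, processing speed has been compounding at roughly 40% a year while economic output has compounded at roughly 5% a year, and the point being made here is that the former has not pulled the latter’s rate upward.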
I would say that it’s only happened because of exponential technological progress; we never had that level of exponential growth until the industrial revolution.
Long-term global growth is achieved only through tech progress. We didn’t have this rate of economic growth before the industrial revolution, that’s true. It wasn’t experienced during the agricultural phase. But foragers didn’t enjoy the same growth rate as farmers. The rate of economic growth has not increased since well before the introduction of computers.