Are there foreseeable developments other than human-level AI which might produce much faster economic growth? (p2)
I think the best bets as of today would be truly cheap energy (whether through fusion, ubiquitous solar, etc.) and nano-fabrication. Though it may not happen, we could see these play out over a 20-30 year term.
The bumps from these, however, would be akin to the steam engine's: dwarfed by (or possibly a result of) AI.
The steam engine heralded the Industrial Revolution and a lasting, large increase in the economic doubling rate. I would expect rapid economic growth after either of these inventions, followed by a return to the existing doubling rate.
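To make the level-versus-rate distinction concrete, here is a minimal sketch of my own (not the commenter's numbers), assuming a roughly 15-year post-industrial doubling time for gross world product and a hypothetical one-time 50% output boost from cheap energy. A one-time boost leaves the economy permanently larger, but the long-run doubling rate is unchanged, unlike the Industrial Revolution, which shortened the doubling time itself.

    import math

    def gwp(years, doubling_time=15.0, level_boost=1.0):
        """Gross world product relative to today, growing exponentially
        with a fixed doubling time, scaled by a one-time level boost."""
        return level_boost * 2 ** (years / doubling_time)

    # Hypothetical figures for illustration only.
    baseline_dt = 15.0   # assumed post-industrial doubling time, in years
    boost = 1.5          # assumed one-time 50% level gain from cheap energy

    for t in (0, 15, 30, 60):
        base = gwp(t, baseline_dt)
        boosted = gwp(t, baseline_dt, boost)
        print(f"year {t:>2}: baseline x{base:5.1f}, boosted x{boosted:5.1f}, "
              f"ratio {boosted / base:.2f}")
    # The ratio stays at 1.5 forever: a level effect, not a faster doubling rate.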
Once a society of real abundance is achieved, the incentive for further economic growth will be gone.
We can argue whether or not such a society is truly reachable, even if only in the material sense. If it is not, whether because of human intractability or AGI inscrutability, progress may continue onward and upward. Perhaps here, as in happiness, it’s the pursuit that counts.
Why do you expect a return to the existing doubling rate in these cases?
Would you count uploads (even if we don’t understand the software) as a kind of AI? If not, those would certainly work.
Otherwise, there are still things one could do with human brains. Better brain-computer interfaces would be helpful, and some fairly mild increases in genome understanding could allow us to massively increase the proportion of people functioning at human genius level.
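To get a feel for why a fairly mild shift could massively increase the proportion at genius level, here is a back-of-the-envelope sketch using a normal model of cognitive ability. The +3 SD "genius" cutoff and the sizes of the mean shifts are my own assumptions for illustration, not figures from the comment.

    import math

    def frac_above(threshold_sd, mean_shift_sd=0.0):
        """Fraction of a standard-normal population above a threshold
        (in SD units) after shifting the population mean by mean_shift_sd."""
        z = threshold_sd - mean_shift_sd
        return 0.5 * math.erfc(z / math.sqrt(2))

    genius = 3.0  # assumed "genius" cutoff: +3 SD (roughly IQ 145)
    for shift in (0.0, 0.5, 1.0):
        p = frac_above(genius, shift)
        print(f"mean shift +{shift} SD -> {p:.4%} above +3 SD "
              f"({p / frac_above(genius):.0f}x baseline)")
    # A 1 SD mean shift raises the +3 SD fraction from about 0.13% to about 2.3%,
    # roughly a 17x increase in the proportion at genius level.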
For the fastest economic growth it is not necessary to achieve human-level intelligence; it may even be a hindrance. The highly complex social behaviour needed to find a reproduction partner is not necessary for economic success. A totally unbalanced AI with highly superhuman skills in creativity, programming, engineering, and deceiving humans could beat a more balanced AI and self-improve faster.

Today's semantic big-data search is already orders of magnitude faster than human research in a library using a paper catalog. We should expect highly superhuman performance at answering questions and lowly subhuman performance at asking them. Strong AI is so complex that projects on normal business time frames go for the low-hanging fruit. If the outcome of such a project can be called an AI at all, it will most probably be extremely imbalanced in its performance and character.
Nanotech is the next big thing because you will have lots of self-replicating tiny machines that can quickly work together as a kind of hive mind. That's important.