Bostrom underestimates the complexity of learning; compare Robin Hanson’s criticism of his book, “I Still Don’t Get Foom”.
Assume the following small-team scenario that could reach a decisive advantage: a hedge fund seeks world dominion and secretly develops a self-improving AI. The following skills are to reach superhuman capability:
cyberattack and cryptanalysis
semantic comprehension of tech and business documents
trading strategy
At the latest when this AI reaches a decisive strategic advantage over other market players, they will notice it immediately. Insider trading investigations will soon reveal that this hedge fund was breaking the law.
This AI would not yet have had time to learn all the other skills needed for world dominion:
creating a globally operating robot guard
creativity to develop countermeasures against every conceivable switch-off scenario
influencing humans (politicians, military brass and mass propaganda).
A human is capable of running a large company or a government at the age of 30 at the earliest. Learning how to influence people offers very few speed-up options, and books are of little help. The AI has to acquire genuine, insightful comprehension of human values and motivations to become a successful world leader.
A quick-and-dirty AI without a carefully designed utility function and the wisdom of humane values will provoke the utmost available power of the entire world, united to switch this AI off.
LOL. Do you really think so?
Yes, indeed, I am convinced that 30 years of learning is a minimum for running a large company or a government. I compiled data on 155 government leaders from five countries. On average, they took office for their first term at the age of 54.3 years.
For my statement above, allow me to subtract 2 standard deviations (2 × 8.5 = 17 years). A government leader is therefore, with 97.7% probability, older than 37.3 years when he takes office for the first time. The probability of a government leader being younger than 30 years is 0.22%, calculated from the normal distribution. William Pitt the Younger became the youngest UK Prime Minister in 1783 at the age of 24; he is the only one younger than 30 in my data set.
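As a quick cross-check, here is a minimal sketch of that calculation, assuming the ages are approximately normally distributed with the mean of 54.3 and standard deviation of 8.5 years from my sample:

```python
# Minimal sketch: tail probabilities under a normal model of first-term ages,
# assuming mean = 54.3 years and SD = 8.5 years (figures from my sample).
from scipy.stats import norm

mean_age, sd_age = 54.3, 8.5
ages = norm(loc=mean_age, scale=sd_age)

# Probability of being older than mean - 2*SD = 37.3 years at first term
p_above_2sd = 1 - ages.cdf(mean_age - 2 * sd_age)  # ~0.977 (97.7%)

# Probability of being younger than 30 at first term
p_under_30 = ages.cdf(30)  # ~0.0021 (about 0.2%)

print(f"P(age > 37.3) = {p_above_2sd:.3f}")
print(f"P(age < 30)   = {p_under_30:.4f}")
```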
Lumifer, could you be so kind as to compile a similar statistical evaluation for roughly the top 150 companies of these five countries? I can help you with the DAX 30. I expect that the average will be lower, for the following reasons (a small sketch of such an evaluation follows the list):
Trust-building is more time-consuming in politics than in business and involves more people.
Some startup pioneers started very young, and their companies grew extremely quickly.
The right tail of the age distribution (>65 years) will be thin for companies.
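Something like the following would do, once we have the age at which each CEO first took the top job; the file name and column name here are placeholders, not real data:

```python
# Minimal sketch of the proposed evaluation, assuming a hypothetical CSV
# "ceo_first_term_ages.csv" with a column "age_at_first_appointment".
import pandas as pd

ages = pd.read_csv("ceo_first_term_ages.csv")["age_at_first_appointment"]

print(f"n = {len(ages)}")
print(f"mean = {ages.mean():.1f} years")
print(f"SD   = {ages.std():.1f} years")  # sample standard deviation
print(f"share under 30 = {(ages < 30).mean():.2%}")
```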
After that, we will both know which of us might LOL. We both know that a doom scenario is a possible outcome of AI development. My intention is to understand what knowledge is needed to rule the world, and how that content could be hidden in order to slow down the learning curve of an AI capable of taking over everything.
Your data shows what typically happens. That’s a bit different from what “a human is capable” of.
Technically speaking, a single counter-example overturns your claim.