I’m not compelled by that analogy. There are lots of things that money can’t buy, but that (sufficient) intelligence can.
There are theoretical limits to what cognition is able to do, but those are so far from the human range that they’re not really worth mentioning. The question is: “are there practical limits to what an intelligence can do, that leave even a super-intelligence uncompetitive with human civilization?”
It seems to me that, as an example, you could just take a particularly impressive person (Elon Musk or John von Neumann are popular exemplars) and ask, “What if there was a nation made up only of people who were that capable?” It seems that if a nation of, say, 300,000,000 Elon Musks went to war with the United States, the United States would lose handily. Musktopia would just have a huge military-technological advantage: they would do fundamental science faster, develop engineering innovations faster, and have better operational competence than the US, on ~ all levels. (I think this is true for a much smaller number than 300,000,000; a number that high just makes the point straightforward.)
Does that seem right to you? If not, why not?
Or alternatively, what do you make of vignettes like That Alien Message?
I don’t think a nation of Musks would win against the current USA, because Musk is optimised for some things (making an absurd amount of money, CEOing, tweeting his shower thoughts), but an actual war requires a rather more diverse set of capacities.
Similarly, I don’t think an AGI would necessarily win a war of extermination against us, because currently (emphasis on currently) it would need us to run its infrastructure. This would change in a world where all industrial tasks could be carried out without physical input from humans, but we are not there yet and will not be soon.