Noted. The problem remains—it’s just less obvious. This phrasing still conflates “intelligent system” with “optimizer”, a mistake that goes all the way back to Eliezer Yudkowsky’s 2004 paper on Coherent Extrapolated Volition.
For example, consider a computer system that, given a number N, can (usually) produce the shortest computer program that outputs N. (The “usually” is essential: the unqualified task is uncomputable.) Such a computer system is undeniably superintelligent, but it’s not a world optimizer at all.
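To make the idea concrete, here is a toy sketch (an assumption of mine, not anything from the original) of what such a system does in a drastically simplified setting: instead of arbitrary programs, it searches over short arithmetic expressions, in increasing length, for the first one that evaluates to N. Note that the search answers a question and then halts; nothing about it steers the world.

```python
from itertools import product

def shortest_expr(n, alphabet="0123456789+*", max_len=5):
    """Brute-force search for the shortest arithmetic expression
    that evaluates to n. A toy stand-in for the (uncomputable)
    task of finding the shortest *program* that outputs n."""
    for length in range(1, max_len + 1):
        for chars in product(alphabet, repeat=length):
            expr = "".join(chars)
            try:
                if eval(expr) == n:  # toy example; never eval untrusted input
                    return expr
            except SyntaxError:
                continue  # most token strings aren't valid expressions
    return None
```

For genuine programs the search cannot be guaranteed to terminate (some candidate programs never halt, which is why the real task is only “usually” solvable), but the shape of the system is the same: a pure question-answering search with no goals over the external world.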
“Far away, in the Levant, there are yogis who sit on lotus thrones. They do nothing, for which they are revered as gods,” said Socrates.