My counter thought experiment to CEV is to consider our distant ancestors. I mean so far distant that we wouldn’t call them human, perhaps even as far back as some sort of fish-like creature. Suppose a super AI somehow offered this fish the chance to rapidly “advance”, following its CEV, showed it a vision of the future (us), and asked the fishy thing whether to go ahead. Do you think the fishy thing would say yes?
Similarly, if an AI offered to evolve humankind, over 50 years, into telepathic little green men, and assured us this was the result of our CEV, would we not instantly shut it down in horror?
My personal preference, which I like to call the GFP (Glorious Five-Year Plan), is this: have the AI offer a range of options for 5 (or perhaps 50, but definitely no longer) years in the future, and we pick one. In five years’ time we repeat the process. The bottom line is that humans do not want rapid change. Just as we are happier with 2% inflation than with 0% or 100%, we want a moderate rate of change.
At its heart there is a “Ship of Theseus” problem. If the AI replaces every part of the ship overnight, so that in the morning we find the QE2 at the dock, then it is not the Ship of Theseus.