So the story isn’t planned to end with Harry creating a godlike super entity operating outside of time to ensure eternal life for all thinking beings, to download all consciousnesses throughout history at their point of death, to lay the foundations of magic, and in that way to all at once resolve questions regarding Atlantis and the afterlife, while delivering the happiest ending possible, one most clearly drunk on wish fulfillment and most indulgent of the ‘bargaining’ stage of grief/acceptance?
Surely not, if the story is nearer its end than its beginning, given its pacing so far. Given Eliezer’s beliefs about FAI, and that the story is not supposed to lie, Harry attempting to create a godlike AI without years of careful research should result in a Bad End.
I admit I had not considered that when making my ridiculous proposal.
However, EY has suggested that a good end and a bad end are already written. The bad end of 3 Worlds Collide was ‘happily ever after,’ so my ridiculous proposal is no less valid for failing your entirely reasonable criteria for a good end; it could be the bad end, teaching a poor moral.
No intention?
The Atlantis thing was proposed in a chapter titled “Hold Off On Proposing Solutions”.
That doesn’t mean, “Don’t Propose a Solution.”
Seems kind of like rehashing old ground covered by Spider Robinson, to me.