Surely not, if the story is nearer its end than its beginning, given its pacing so far. Given Eliezer’s beliefs about FAI, and that the story is not supposed to lie, Harry attempting to create a godlike AI without years of careful research should result in a Bad End.
I admit I had not considered that when making my ridiculous proposal.
However, EY has suggested that a good end and a bad end are already written. The bad end of 3 Worlds Collide was 'happily ever after,' so my ridiculous proposal is no less valid for failing your entirely reasonable criterion for a good end, namely that it would teach a poor moral.