In one of my imaginable-ideal-barely-possible worlds, Eliezer’s current choice of “thing to point your seed AI at and say ‘that’s where you’ll find morality content’” was tentatively determined to be what it currently nominally is (instead of tempting alternatives like “the thing that makes you think that your proposed initial dynamic is the best one” or “the thing that causes you to care about perfecting things like the choice of initial dynamic” or something) after he did a year straight of meditation along something like the lines of reasoning I suggest above, except honed to something like perfection-given-boundedness (e.g. something like the best you could reasonably expect to get at poker given that most of your energy has to go into retaining your top-5 FIDE chess rating while writing a bestselling popular science book).