Hollerith, you are now officially as weird as a Yudkowskian alien. If I ever write this species I’ll name it after you.
Eliezer, to which of the following possibilities would you accord significant probability mass? (1) Richard Hollerith would change his stated preferences if he knew more and thought faster, for all reasonable meanings of “knew more and thought faster”; (2) There’s a reasonable notion of extrapolation under which all normal humans would agree with a goal in the vicinity of Richard Hollerith’s stated goal; (3) There exist relatively normal (non-terribly-mutated) current humans A and B, and reasonable notions of extrapolation X and Y, such that “A’s preferences under extrapolation-notion X” and “B’s preferences under extrapolation-notion Y” differ as radically as your and Richard Hollerith’s preferences appear to diverge.