Still, this is not a truth-maximizing website
I agree with this, but popularity correlates better with truth here than on any other website (or, more broadly, any other social group) that I know of. And actually, I think it's probably not possible for a relatively open venue like this to be perfectly truth-seeking. To go further in that direction, I think you ultimately need some sort of institutional design that explicitly rewards accuracy, like prediction markets. But the ways in which LW differs from pure truth-and-importance-seeking don't strike me as entirely bad either: posts that are inspiring or funny get upvoted more, for instance. I think it would be difficult to nucleate a community focused on truth-seeking without "emotional energy" of this sort.
Because there might be other programs with a lot of computational resources that scan through simple universes looking for programs to run?
This one doesn't feel so unintuitive to me, since "easy to point at" is somewhat like "has a lot of copies," so it's kind of similar to the biological desire to reproduce (though I assume you're alluding to some other motivation for becoming easy to point at?).