Why did you pick caring about each other as a thing culture + evolution was trying to do?
You’re not alone in making this mistake. I think it’s actually quite common to feel like it’s amazing that evolution managed to come up with us humans who like beauty and friendship and the sanctity of human life and so on: evolution and culture must have been doing something right to come up with such great ideas.
But in the end, no; evolution is impressive but not in that way. You picked caring about each other as the target because humans value it—a straightforward case of painting the target around the bullet-hole.
I don’t find it amazing, exactly. It’s more like… I don’t know how to write the pseudocode for an AI that actually cares about human welfare. In my mind, that is pretty close to something that tries to be aligned. But if even evolution managed to create agents capable of this by accident, then maybe it isn’t that hard.
But, like, evolution made a bunch of other agents that didn’t have these properties.
Yes. The right word in my title would probably have been "possible", not "easy". "Easy" is quite subjective.
I initially had a paragraph in the question explaining my motivation, but removed it in favor of brevity. I'm kind of regretting that now, because people seem to read this as me thinking of evolution as some spirit or something.