@Lara_Foster: You see, it seems quite likely to me that humans evaluate utility in such a circular way under many circumstances, and therefore aren’t performing any optimizations.
Eliezer touches on that issue in “Optimization and the Singularity”:
Natural selection prefers more efficient replicators. Human intelligences have more complex preferences. Neither evolution nor humans have consistent utility functions, so viewing them as “optimization processes” is understood to be an approximation.
By the way, ask middle-school girls to rank their boyfriend preferences and you'll find Billy beats Joey, who beats Micky, who beats Billy...
Would you mind peeking into your mind and explaining why that arises? :-) Is it just a special case of the phenomenon you described in the rest of your post?
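Incidentally, the cycle above is exactly what rules out a consistent utility function: no single ranking can agree with every pairwise preference. A minimal sketch (the names and data structure are just for illustration) that brute-forces this:

```python
from itertools import permutations

# Hypothetical pairwise preferences: each tuple (a, b) means "a beats b".
prefs = [("Billy", "Joey"), ("Joey", "Micky"), ("Micky", "Billy")]

def consistent_utility_exists(prefs):
    """True if some total ranking (i.e. a scalar utility) respects every pairwise preference."""
    people = {p for pair in prefs for p in pair}
    for order in permutations(people):
        rank = {p: i for i, p in enumerate(order)}  # lower index = more preferred
        if all(rank[a] < rank[b] for a, b in prefs):
            return True
    return False

print(consistent_utility_exists(prefs))  # the cycle makes this False
```

Drop any one leg of the cycle and the check succeeds, which is the sense in which such preferences can't be summarized by "optimizing" a single quantity.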