Something with a utility function, if it values an apple 1% more than an orange, will, when offered a million apple-or-orange choices, choose a million apples and zero oranges. The division within most people into selfish and unselfish components is not like that: you cannot satisfy it entirely with unselfish choices, whatever the ratio. Not unless you are a Keeper, maybe, who has made yourself sharper and more coherent; or maybe not even then, who knows?
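A minimal sketch of the claim above, under my own assumptions (the `choose` helper and the specific utility weights are illustrative, not from the parable): a maximizer of a *linear* utility takes whichever single item scores higher on every one of the million independent choices, so a 1% edge produces a 100% apple outcome.

```python
def choose(u_apple: float, u_orange: float, n_choices: int) -> tuple[int, int]:
    """Offer n_choices independent apple-or-orange picks to a linear
    utility maximizer; it always takes the higher-utility item."""
    apples = oranges = 0
    for _ in range(n_choices):
        if u_apple > u_orange:
            apples += 1
        else:
            oranges += 1
    return apples, oranges

# A 1% preference for apples yields all apples, zero oranges.
print(choose(1.01, 1.00, 1_000_000))  # (1000000, 0)
```

The point is that with a linear utility the *size* of the preference gap never matters, only its sign; every choice is decided the same way.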
I fear that this parable encourages a view whereby the utility function “should” factorize over intuitively obvious discrete quantities (e.g. apples and oranges). My utility function can value having a mixture of both apples and oranges: it need not be linear in each quantity, and if it is concave in them, the optimum is a mix rather than a corner.
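A sketch of that counterpoint, again under my own assumptions (the Cobb-Douglas-style form and the exact weights are my illustration): a utility `u(a, o) = 1.01·log(a) + log(o)` still weights apples 1% more than oranges, yet maximizing it over a fixed million-item budget lands at an interior mix, not at the all-apples corner.

```python
import math

TOTAL = 1_000_000  # fixed budget of items to allocate between fruits

def utility(apples: int, oranges: int) -> float:
    """Concave utility that still favors apples by 1%."""
    if apples == 0 or oranges == 0:
        return float("-inf")  # log(0): a pure corner is worst of all here
    return 1.01 * math.log(apples) + math.log(oranges)

# Brute-force search over every split of the budget.
best = max(range(TOTAL + 1), key=lambda a: utility(a, TOTAL - a))
print(best, TOTAL - best)  # roughly half-and-half, tilted slightly toward apples
```

So the same 1% preference that forced a corner under a linear utility merely tilts the optimal mixture slightly when the utility is concave; nothing about having a utility function requires spending the whole budget on one good.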