I think the expectation is that, if all humans had the same knowledge and were better at thinking (and were more the people we’d like to be, etc.), then there would be a much higher degree of coherence than we might expect, but not necessarily that everyone would ultimately have the same utility function.