Humans aren’t utility maximizers, but we think of ourselves as them
What makes you believe this? I wouldn't assume that most people think of themselves that way. To maximize utility you first have to define a utility function, and that is impossible for most of us. I find that I have a fuzzy list of wishes to satisfy, with unclear priorities that shift over time. I can imagine that if a rational entity tried to make sense of other entities that appear similar to itself, it might make an assertion like yours. But what if it turns out that the other entities operate with a much lower ratio of rational to non-rational ("System 1", if you will) functioning during any given time period? It could be that most people are not attempting to maximize anything most of the time. Perhaps once in a while they sit down and reason about a particular goal, and the rest of the time they delegate to more basic systems.
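To make the contrast concrete, here's a minimal sketch (the wishes and weights are entirely hypothetical) of the difference between a fixed utility function, under which the best action is always well defined, and the fuzzy, drifting wish list described above:

```python
import random

# A utility maximizer: one fixed function scoring every outcome,
# so "the best action" is a well-posed question with a stable answer.
def utility(outcome: str) -> float:
    return {"rest": 1.0, "work": 2.0, "socialize": 1.5}[outcome]

print("fixed maximizer picks:", max(["rest", "work", "socialize"], key=utility))

# The fuzzy alternative: a list of wishes whose weights drift over time,
# so today's apparent "best" choice need not match tomorrow's.
wishes = {"rest": 1.0, "work": 2.0, "socialize": 1.5}

def drift(ws: dict) -> dict:
    # Hypothetical drift model: weights wander randomly, floored at zero.
    return {w: max(0.0, v + random.gauss(0, 0.5)) for w, v in ws.items()}

for day in range(3):
    wishes = drift(wishes)
    print(f"day {day} top wish:", max(wishes, key=wishes.get))
```

The second agent can't be summarized by any single utility function over outcomes, which is roughly the situation I'm describing.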