This all makes sense if the purpose of life is to solve problems. It’s not. Being rational means maximizing your own goals, and usually people care more about some sort of happiness than about solving the maximum number of problems. Spending most of your time thinking about problems you probably can’t solve anyway tends to make people unhappy. So it’s irrational by the goals of humans, even though it’s roughly rational by the goals of evolution.
Agreed that people have lots of goals that don’t fit in this model. It’s definitely a simplified model. But I’d argue that ONE of (most) people’s goals is to solve problems; and I do think, broadly speaking, it is an important function (evolutionarily and currently) of conversation. So I still think this model gets at an interesting dynamic.