Anecdotally I don’t see much correlation between goals and intelligence in humans.
Some caveats:
There are goals only intelligent people are likely to have because they’re simply not reachable (or even not understandable) for unintelligent people, such as proving Fermat’s Last Theorem.
There are intermediary goals only unintelligent people will have, because they don’t realise those goals won’t achieve their aims. For example, an intelligent person is less likely to propose a utopia where everyone does everything for free, because they realise it’s unlikely to work.
Your second caveat is the point I’m making.