The difference between “slightly above human” and “a very high level of superintelligence” is difficult to grasp, because we don’t have a good way to quantify intelligence, nor to predict how much intelligence a given achievement requires. That said, some plausible candidates (in addition to the two you mentioned, which are reasonable) are:
Solving all other X-risks
Constructing a Dyson sphere, or something else that allows converting physical resources into human flourishing far more efficiently and on a far larger scale
Solving all problems of society/government/economics, except to the extent we want to solve them ourselves
Creating a way of life for everyone that is neither oppressive (e.g. having to work a boring or unpleasant job) nor dull and meaningless
Finding the optimal way to avert a Malthusian catastrophe while satisfying the human preferences for reproduction and immortality
Allowing us to modify/improve the minds of ourselves and our descendants, and/or create entirely new kinds of minds, while protecting us from losing our values and identities, or unintentionally triggering a moral catastrophe
Solving all moral conundrums involving animals, wild nature and other non-human minds, if such exist
Negotiating with aliens, if such exist (but that is probably very non-urgent)
Regarding near-light-speed space travel (and space colonization): it does seem necessary if we want to make the best use of the universe.
Also, I think Gurkenglas has a very good point regarding acausal trade.