I’m worried that some of my concepts here are a little bit shaky and confused in a way that I can’t articulate, but my provisional answer is: because their planet would have to be virtually a duplicate of Earth to get that kind of match. Suppose that my deepest heart’s desire, my lifework, is for me to write a grand romance novel about an actuary who lives in New York and her unusually tall boyfriend. That’s a necessary condition for my ideal universe: it has to contain me writing this beautiful, beautiful novel.
It doesn’t seem all that implausible that powerful aliens would have a goal of “be nice to all sentient creatures,” in which case they might very well help me with my goal in innumerable ways, perhaps by giving me a better word processor, or providing life extension so I can grow up to have a broader experience base with which to write. But I wouldn’t say that this is the same thing as the alien sharing my goals, because if humans had never evolved, it almost certainly wouldn’t have even occurred to the alien to create, from scratch, a human being who writes a grand romance novel about an actuary who lives in New York and her unusually tall boyfriend. A plausible alien is simply not going to spontaneously invent those concepts and put special value on them. Even if they have rough analogues of “courtship story” or even “person who is rewarded for doing economic risk-management calculations,” I guarantee you they’re not going to invent New York.
Even if the alien and I end up cooperating in real life, when I picture my ideal universe, and when they picture their ideal universe, they’re going to be different visions. The closest thing I can think of would be for the aliens to have evolved a sort of domain-general niceness, and to have a top-level goal for the universe to be filled with all sorts of diverse life with their own analogues of pleasure or goal-achievement or whatever, which my beautiful, beautiful novel and I would qualify as a special case of. Actually, I might agree with that as a good summary description of my top-level goal. The problem is, there are a lot of details that that summary description doesn’t pin down, which we would expect to differ. Even if the alien and I agree that the universe should blossom with diverse life, we would almost certainly have different rankings of which kinds of possible diverse life get included. If our future lightcone only has room for 10^200 observer-moments, and there are 10^4000 possible observer-moments, then most possible observer-moments won’t get to exist. I would want to ensure that my beautiful, beautiful novel and I get included, whereas the alien would have no advance reason to privilege me and my beautiful, beautiful novel over the quintillions of other possible beings with desires that they think of as their analogue of beautiful, beautiful.
This brings us to the apparent inevitability of something like cultural imperialism. Humans aren’t really optimizers—there doesn’t seem to be one unique human vision for what the universe should look like; there’s going to be room for multiple more-or-less reasonable construals of our volition. That being the case, why shouldn’t even benevolent aliens pick the construal that they like best?
Domain-general niceness works. It’s possible to be nice and helpful to lots of different kinds of people with lots of different kinds of goals. Think Superhappies, except with respect for autonomy.