I had read it, had forgotten about it, hadn’t connected it with this story… but didn’t need to.
This story makes the goal clear enough. As I see it, eating the entire Universe to get the maximal number of mind-seconds[1] is expanding just to expand. It’s, well, gauche.
Really, truly, it’s not that I don’t understand the Grand Vision. It never has been that I didn’t understand the Grand Vision. It’s that I don’t like the Grand Vision.
It’s OK to be finite. It’s OK to not even be maximal. You’re not the property of some game theory theorem, and it’s OK to not have a utility function.
It’s also OK to die (which is good because it will happen). Doesn’t mean you have to do it at any particular time.
Appropriately weighted if you like. And assuming you can define what counts as a “mind”.
I thought it was pretty courageous of you to state this so frankly here, especially given how the disagree-votes turned out.
The problem with not expanding is that you can be pretty sure someone else will then grab what you didn’t and may use it for something that you hate. (Unless you trust that they’ll use it well.)
It’s not “just to expand”. Expansion, at least in the story, is instrumental to whatever the content of these mind-seconds is.
I already have people planning to grab everything and use it for something that I hate, remember? Or at least for something fairly distasteful.
Anyway, if that were the problem, one could, in theory, go out and grab just enough to be able to shut down anybody who tried to actually maximize. Which gives us another armchair solution to the Fermi paradox: instead of grabby aliens, we’re dealing with tasteful aliens who’ve set traps to stop anybody who tries to go nuts expansion-wise.
Beyond a certain point, I doubt that the content of the additional minds will be interestingly novel. Then it’s just expanding to have more of the same thing that you already have, which is more or less identical from where I sit to expanding just to expand.
And I don’t feel bound to account for the “preferences” of nonexistent beings.
Somehow people keep finding meaning in falling in love and starting a family, even when billions of people have already done that before. We also find meaning in pursuing careers very similar to those millions of people have had before us, or in traveling to destinations that have been visited by millions of tourists. The more similar an activity is to something our ancestors did, the more meaningful it seems.
From the outside, all this looks grabby, but from the inside it feels meaningful.
… but a person who doesn’t exist doesn’t have an “inside”.
Which non-existing person are you referring to?
You can choose or not choose to create more “minds”. If you create them, they will exist and have experiences. If you don’t create them, then they won’t exist and won’t have experiences.
That means that you’re free to not create them based on an “outside” view. You don’t have to think about the “inside” experiences of the minds you don’t create, because those experiences don’t and will never exist. That’s still true even on a timeless view; they never exist at any time or place. And it includes not having to worry about whether or not they would, if they existed, find anything meaningful[1].
If you do choose to create them, then of course you have to be concerned with their inner experiences. But those experiences only matter because they actually exist.
I truly don’t understand why people use the word “meaningful” in this context or exactly what it’s supposed to, um, mean. But pick pretty much any answer and it’s still true.
My point is that potential parents often care about non-existing people: their potential kids. And once they bring those potential kids into existence, the kids might start caring about the next generation. Similarly, some people/minds will want to expand because that is what their company does, or because they would like the experience of exploring a new planet/solar system/galaxy, or the status of being the first to settle there.
If it’s OK to not be maximal, that will be reflected in The Grand Vision. But if we stay non-maximal, it means that an immeasurable number of wonders will never exist because of whatever limited vision you like. This is unfair.