I’m advocating for people to stop talking/thinking as though post-AGI life is a different magisterium from pre-AGI life
Seems undignified to pretend that it isn’t? The balance of forces that make up our world isn’t stable. One way or the other, it’s not going to last. It would certainly be nice, if someone knew how, to arrange for there to be something of human value on the other side. But it’s not a coincidence that the college example is about delaying the phase transition to the other magisterium, rather than expecting as a matter of course that people in technologically mature civilizations will be going to college, even conditional on the somewhat dubious premise that technologically mature civilizations have “people” in them.
The physical world has phase transitions, but it doesn’t have magisteria. ‘Non-overlapping magisteria’, as I’m using the term, is a question about literary genres; about which inferences are allowed to propagate or transfer; about whether a thing feels near-mode or far-mode; etc.
The idea of “going to college” post-AGI sounds silly for two distinct reasons:
1. The post-singularity world will genuinely be very different from today’s world, and institutions like college are likely to be erased or wildly transformed on relatively short timescales.
2. The post-singularity world feels like an inherently “far-mode world” where everything that happens is fantastic and large-scale, with none of the humdrum minutiae of a single person’s life, ambitions, or day-to-day routine. Part of this is the heuristic ‘personal goals are near, altruistic goals are far’.
1 is reasonable, but 2 is not.
The original example was about “romantic and reproductive goals”. If the AGI transition goes well, it’s true that romance and reproduction may work radically differently post-AGI, or may be replaced with something wild and weird and new.
But it doesn’t follow from this that we should think of post-AGI-ish goals as a separate magisterium from romantic and reproductive goals. Making the transition to AGI go well is still a good way to ensure romantic and reproductive success (especially qua “long-term goals/flourishing”, as described in the OP), or success on whatever goals end up mattering even more to you than those, if circumstances change in such a way that some crazy, even better posthuman opportunity opens up that you prefer.
(I’m assuming here that we shouldn’t optimize goals like “kids get to go to college if they want” in totally qualitatively different ways than we optimize “kids get to go to college if they want, modulo the fact that circumstances might change in ways that bring other values to the fore instead”. I’m deliberately choosing “college”, an adorably circa-2022 goal that seems especially unlikely to carry over to a crazy post-AGI world, because I think the best way to reason about a goal like that is similar to the best way to reason about goals where it’s more uncertain whether they’ll transfer over to the new phase.)