I used to be heavily into this area, and after succumbing somewhat to an ‘it all adds up to normality’ shoulder-shrugging, my feeling on this is that it’s not just the ‘environment’ that is subject to radical changes, but the mind itself. It might be that there’s a kind of mind-state attractor, by which minds tend to move along predictable paths and converge upon weirdness together. All of consciousness may, by different ways of looking at it, be considered as fragments of that endstate.
Imagine a benevolent AI operating on a universal scale, one that simulates the greatest achievable number of copies of one specific “life” — that is, a continuous sequence of states from the emergence of consciousness to some form of nirvana. If we assume that during brain death experience becomes simpler, eventually reaching the simplest possible observer moment (which would be identical for all dying minds), we can ask what the next observer moment should be. If we already have the simplest one, the next should be more complex; and if complexity has a tendency to grow, the following moments might belong to the emerging mind of some creature. This would be a form of multiverse reincarnation, though there is no way to carry memory across in such a scenario.
We could imagine that some benevolent AI would create a greater measure of one simple, suffering-free life (from the simplest state up to a computationally achievable, simple nirvanic state) in order to minimize the total amount of suffering — which would itself be a form of mind attractor.
Nevertheless, after considering the idea, I think it faces a serious objection: it would not be a way to save anyone, because there would be no “person” left to be saved.
(Excuse my English.)