i feel like letting people try things, with the possibility of rollback from backup, generally works. let people do stuff by default, and when something looks like a person undergoing too much suffering, roll them back (or terminate them, or whatever other ethically viable outcome is closest to what they would want).
maybe pre-emptive “you can’t even try this” would only start making sense if there were concerns that too much experience-time is being filled with people accidentally ending up suffering from unpredictable modifications. (though i suspect i don’t really think this because i’m usually more negative-utilitarian and less average-utilitarian than that)
that said, i’ve never modified my mind in a way that caused me to experience significant suffering. i have a friend who kinda has, by taking LSD and then having a very bad time for the rest of the day, and today-them says they’re glad to have been able to try it. but i think LSD-day-them would strongly disagree.
Yeah, that makes sense.
I’d like the serious modifications to (at the very least) require a lot of effort to do. And be gradual, so you can monitor if you’re going in the right direction, instead of suddenly jumping into a new mindspace. And maybe even collectively decide to forbid some modifications.
(btw, here is a great story about hedonic modification https://www.utilitarianism.com/greg-egan/Reasons-To-Be-Cheerful.pdf)
The reason I lean toward relying on my friends, not a godlike entity, is that by default I distrust centralized systems with enormous power. But if we had an Elua as good as the one you depicted, I would be okay with that ;)
thanks for the egan story, it was pretty good!
i tend to dislike such systems as well, but a correctly aligned superintelligence could surely be trusted with something like this. if anything, it would at least know about the ways it could fail at this, and tell us what it knows of those possibilities.