the key task is to prevent a single moment of total control. if one occurs, we've failed. pivotal acts are the problem, not the solution.
Moments of total control may be a very difficult feature of the gameboard to eliminate.
yep. that’s kind of the whole challenge, isn’t it? I think we can do it. it’s time to just shut up and do the impossible, can’t walk away...
an alternate perspective: what if everyone already has, ehh, near-total control? it's a property of chaotic systems that small perturbations can have cascading effects, after all...
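to make the chaos claim concrete, here's a minimal sketch (the logistic map at r = 3.9 as a stand-in chaotic system, and the starting point and perturbation size, are my illustrative choices, not anything anyone in this thread proposed): two trajectories that start 1e-6 apart end up order-1 apart within a few dozen iterations.

```python
# Minimal illustration of sensitive dependence on initial conditions,
# using the logistic map x -> r*x*(1-x) as a stand-in chaotic system.
# All specifics here (r = 3.9, x0 = 0.2, perturbation 1e-6) are
# illustrative assumptions.

def logistic_trajectory(x0: float, r: float = 3.9, steps: int = 50) -> list[float]:
    """Iterate the logistic map from x0 and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.200000)
b = logistic_trajectory(0.200001)  # a "butterfly flap": perturbed by 1e-6

for t in range(0, 51, 10):
    print(f"t={t:2d}  x_a={a[t]:.6f}  x_b={b[t]:.6f}  gap={abs(a[t] - b[t]):.6f}")
# The gap grows from 1e-6 to order 1 within a few dozen steps:
# a tiny element cascades, though the direction of the cascade is unpredictable.
```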
If we were all just butterflies flapping in the breeze of fate, I'd call that near-zero control, not near-total control. Am I missing something about this line of reasoning?
yeah I think that was just a bad train of thought and I was wrong.
How do you propose to robustly align the “pivotal process” (?) that we’d want to have in place of the pivotal act? How do we ensure that it’ll output eudaimonic values, instead of being taken over by the power-hungry, as such processes are wont to do?
by describing to each other, efficiently enough, how to prevent takeover by the power-hungry at every level. anything less than a solution to the loss of slack to power-seekers is insufficient to prevent them from suddenly eating up all of humanity's slack.
or in other words, I don't have a magic solution to post in this comment, but a solution has to be something that solves this above everything else. It is a challenge of hyper-generalized defense analysis, because now we have to write down morality precisely enough that we don't end up in a war with a new species.
I genuinely think it might be harder than the actual technical problem of alignment, and we ought to look for any path which isn’t that hopelessly doomed.