i’m glad that you wrote about AI sentience (i don’t see it talked about very often, or with much depth), that it was effortful, and that you cared enough to write about it at all. i wish that kind of care was omnipresent, and i’d strive to care better in that kind of direction.
and i also think continuing to write about it is very important. depending on how you look at things, we’re in a world of ‘art’ at the moment—emergent models of superhuman novelty generation and combinatorial re-building. art moves culture, and culture curates humanity at aggregate scales.
your words don’t need to feel trapped in your head, and your interface with reality doesn’t need to be limited to one imperfect, highly curated community. all communities we come across will be imperfect, and under scarcity—when there’s only one community to interface with—it can seem like you’re forced to grant it privilege. but continued effort might reduce that scarcity, once you find where else your words can be heard.
your words can go further: the inferential distance your mind can cross—and the dynamic correlation between your mind and others—is increasing. that’s a sign of approaching a critical point. if you’d like to be heard, there are new avenues for doing so: we’re in the over-parametrized regime.
all that means is that there are far more novel degrees of freedom to move around in, and getting unstuck is no longer limited to ‘wiggling against constraints’. is ‘the feeling of smartness’ or ‘social approval from community x’ a constraint you struggled against before when enacting your will? perhaps there are new ways to fluidly move around those constraints in this newer reality.
i’m aware that this sounds very abstract, but it’s honestly drawn from a real observation about how information gets bent when predictive AIs are the new celestial bodies. if information you produce can get copied, mutated, mixed, curated, tiled, and amplified, then you increase your options for what to do with your thoughts.
i hope you continue moving, with a growing stockpile of adaptations and strategies—it’ll help. both the process of building the library of adaptations and the adaptations themselves will.
in the abstract, i’d be sad if the acausal web of everyone who cared enough to speak effortfully about things of cosmic relevance, but felt unheard, selected themselves away. it’s not the selection process we’d want at multiversal scales.
the uneven distribution of luck in our current time, before the Future, means that going through that process won’t always be rewarding and might even threaten to induce hopelessness—but hopelessness can often be a deceptive feeling, one that overlooks the improvements you’re actually making. it’s not something we can easily help by default; we’re not yet gods.
returning to a previous point about the imperfections of communities:
the minds or communities you’ll encounter (the individuals who respond to you on LW, AIs, your own mind, etc.), like any other complexity we stumble across, were evolved, shaped, and mutated by any number of cost functions, and are full of path-dependent, frozen accidents.
nothing now is even near perfect, nothing is fully understood, and things don’t yet have the luxury of being their ideals.
i’d hope that, eventually, negative feedback here (or the lack of any feedback at all) is taken with a grain of salt, incorporated into your mind if you think it makes sense, and that it isn’t given any additional, qualitatively negative amplification.
a small, curated group of people—one not well trained to help others improve in all regards—won’t be all that useful for growth at the object level.
AI sentience, and suffering on cosmic scales in general, is important, and i want to hear more about it. your voice isn’t screaming into the same void as before, not when AIs learn, compress, and incorporate your sentiments into themselves. thanks for the post and for writing genuinely.