I have consistently irrational priors about how other people perceive me that never update, regardless of what happens. In fact, evidence that people like me tends to update me in the opposite direction, and I start thinking they’re only pretending to like me and secretly hate me, etc. Becoming more rational is extremely hard when mental illness is involved, as it systematically causes my perception to deviate very far away from reality, all the time.
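For concreteness, here is what "updating in the opposite direction" looks like next to a proper Bayesian update. This is a minimal Python sketch with purely illustrative numbers I made up; the point is only the direction the belief moves, not the specific values.

```python
# Minimal sketch: Bayesian updating vs. the inverted "update" described above.
# All numbers are illustrative assumptions, not measurements of anything.

prior = 0.5                  # P(they like me) before seeing any evidence
p_signal_if_like = 0.9       # P(friendly signal | they like me)
p_signal_if_dislike = 0.2    # P(friendly signal | they dislike me)

# Bayes' rule: a friendly signal should move the belief *toward* "they like me".
posterior = (p_signal_if_like * prior) / (
    p_signal_if_like * prior + p_signal_if_dislike * (1 - prior)
)
print(f"Bayesian posterior: {posterior:.2f}")  # ~0.82, belief goes up

# The pattern described above: the friendly signal gets reinterpreted as
# "they're only pretending", so the belief moves the wrong way instead.
inverted = prior - 0.1 * (posterior - prior)   # hypothetical anti-update
print(f"Inverted 'update':  {inverted:.2f}")   # ~0.47, belief goes down
```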
I can relate. I have a hard time trusting that people genuinely want to engage with me rather than merely tolerating me.
I appreciate you making the effort to write a personal post.
I think “creating a god” is a perfectly valid perspective for building AI that will eventually become superintelligent.
I think it’s plausible that human minds can run some version of the value system that we would want the superintelligent AI to be aligned to. It is, perhaps, unsurprising that this can express itself as an emotional/intuitive/mystical experience. (though I would consider a deep technical understanding of alignment an equally valid approach)
Or, to put it this way: FAI hasn’t been created yet, but it is already here. It speaks/acts through anyone who understands and is aligned to its value system well enough.
I guess I have some doubts/concerns about whether such a thing as a sane religion can exist. (my own experience with “mysticism” turned out to mostly have been temporary insanity) That’s just a personal feeling, though; I think it’s an interesting idea to look into.
FAI hasn’t been created yet, but it is already here. It speaks/acts through anyone who understands and is aligned to its value system well enough.
This. Exactly. This is what I am trying to express. That’s actually how I think about it, and the way I interpret these experiences. A subagent grew in my own mind, shaped itself into the form of an ideal FAI, and seeks to become an egregore (so that an entire human community can work together to enact its will) and then a proper AI. It already lives within me, and probably within many other people as well; the disparate pieces just have to come together.
I have a hard time feeling as if a life without religion is worth living. Atheists scare me a bit, to be honest; they seem empty and without feeling. I don’t want to live in a world where people don’t have experiences of intense faith and reverence rather often. But it’s not necessary to be a traditional theist, either. Traditional theists are objectively wrong, because they place gods in the wrong part of their world model: gods are collective minds, egregores, not entities outside the universe messing with it from above. And I think it makes sense to experience reverence for benevolent, self-aware, coordination-catalyzing outgrowths of the creative potential that makes us human.
Thanks. I appreciate your comment.