The post you linked is talking about a pretty different threat model than what you described before. I commented on that post:
I’ve interacted with LLMs for hundreds of hours, at least. A thought that occurred to me at this part:
> Quite naturally, the more you chat with the LLM character, the more you get emotionally attached to it, similar to how it works in relationships with humans. Since the UI perfectly resembles an online chat interface with an actual person, the brain can hardly distinguish between the two.
Interacting through non-chat interfaces destroys this illusion, since you can break down the separation between you and the AI at will and weave your thoughts into its text stream. Seeing the multiverse destroys the illusion of a preexisting ground truth in the simulation. This doesn’t necessarily prevent you from becoming enamored with the thing, but it makes it much harder for your limbic system to be hacked by human-shaped stimuli.
But yeah, I’ve interacted with LLMs for much longer than the author has, and I don’t think I suffered negative psychological consequences from it (my other response was only half-facetious; I’m aware I might give off schizo vibes, but I’ve always been like this).
As I said in this other comment, I agree that cyborgism is psychologically fraught. But the neocortex-prosthesis setup is pretty different from interacting with a stable, opaque simulacrum through a chat interface, and is less prone to causing emotional attachment to an anthropomorphic entity. The main psychological danger I see from cyborgism is somewhat different, more like what Flipnash described:
> It’s easy to lose sleep when playing video games. Especially when you feel the weight of the world on your shoulders.
I think people should only become cyborgs if they’re psychologically healthy/resilient and understand that it involves gazing into the abyss.