A second post has been banned. Strange: it was on a totally different topic from Roko’s.
Still the sort of thing that will send people close to the OCD side of the personality spectrum into a spiral of nightmares, which, please note, has apparently already happened in at least two cases. I’m surprised by this, but I accept reality. We may have more than the usual number of OCD-side-of-the-spectrum people among us.
So, this is the problem that didn’t occur to me. I implicitly assumed that because such things were easy for me to brush off, the same would hold for others. Which is kind of silly, because I knew about one of the previous worriers from Benton House.
I think that the bottom line here is that I need to update in favor of greater general caution surrounding anything to do with the singularity, AGI, etc.
Was the discussion in question epistemologically interesting (vs. intellectual masturbation)? If so, how many OCD personalities joining the site would call for closing the thread? I am curious about the decision criteria. Thanks.
As an aside, I’ve had some SL-related psychological effects, particularly related to the material notion of self: a bit of trouble going to sleep, realizing that, logically, there is little distinction from the death-state. This lasted a short while, but then you just learn to “stop worrying and love the bomb”. Besides “time heals all wounds”, certain ideas helped, too. (I actually think this is an important SL, though it does not sit well within the SciFi hierarchy.)
This worked for me, but I am generally very low on the OCD scale, and I am still mentally not quite ready for some of the discussions going on here.
It is impossible to have rules without Mr. Potter exploiting them.
There is an upside to this, though. Timelessly speaking, there is nothing special about the moment of your death, since there are always going to be other yous elsewhere that are alive, and there will always be some continuations of any given experience moment that survive. It is very Zen.
Is it OCD or depression? Depression can include (is defined by?) obsessively thinking about things that make one feel worse.
Depressive thinking generally focuses on short-term issues or general failure. I’m not sure this reflects that. Frankly, superficially at least, it comes across more like paranoia, especially of the form that one historically saw (and still sees) in some Christians worrying about hell and whether or not they are saved. The reaction to these threads is making me substantially update my estimates both of LW as a rational community and of our ability to discuss issues in a productive fashion.
I wonder why PlaidX’s post isn’t getting deleted—the discussion there is way closer to the forbidden topic.
Yep. But not unexpectedly this time; homung posted in the open thread that he was looking for 20 karma so he could post on the subject, and I sent him a private message saying he shouldn’t, which he either didn’t see or ignored.
What was the second topic? I am most interested in knowing just what things are forbidden.
It was about the possibility of torturing someone by creating copies of the person and torturing them.
If I’m thinking of the right post, it’s another one that involved AI and torture, though from a very different angle than Roko’s post. It was a dialogue between a human and a uFAI; I don’t quite remember what points it was trying to make, but if we’re talking about what could affect people with OCD/anxiety conditions, then it’s probably just the “uFAI talking about torturing people” aspect that was deemed problematic anyway.
Ahh, I remember the one. It was titled “What do you choose? 3^^^3 people being tortured for 50 years or some E. coli in the eye?” uFAIs in counterfactuals do evil things in contrived scenarios. It’s what they do.
Oh, wow, I don’t think I saw that one. I guess that makes three banned AI-torture posts, then?
Edit: The one I was thinking of was “Acausal torture? But a scratch, in the multiverse. (A dialogue for human and UFAI)”.