How are you using “basilisk”?
My understanding of Roko’s is that being exposed to it decreases global utility while exposure to “you should help other people and it’s very important” increases it. But I don’t know if that’s relevant to basilisk status.
There’s a difference between “you should help other people and it’s very important” and “helping other people is so important that you should treat your quality of life as irrelevant”. The latter leads to some combination of burnout, despair, and apathy, though possibly with some useful work done on the way to burnout.
I don’t believe “helping other people is so important that you should treat your quality of life as irrelevant”, because of the negative consequences you describe.
(I still don’t see a basilisk here.)
You don’t believe that, but that’s how some people see the utilitarian calculation.
The problem is precisely that people are reluctant to admit that they choose not burning out over helping others.
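That calculation can be made concrete with a toy model (entirely illustrative; the numbers and the burnout assumption are invented): even on purely utilitarian terms, an unsustainable "quality of life is irrelevant" pace can produce less total good than a sustainable one.

```python
# Toy model: total good done over a career under two pacing strategies.
# All numbers are invented for illustration only.

def total_impact(effort_per_year: float, years_before_burnout: float,
                 career_years: float = 40.0) -> float:
    """Good done = effort per year * years actually worked.

    Burnout ends productive work early, capping the years worked.
    """
    productive_years = min(years_before_burnout, career_years)
    return effort_per_year * productive_years

# Sustainable pace: moderate effort, full career.
sustainable = total_impact(effort_per_year=1.0, years_before_burnout=40.0)

# Maximal pace: double the effort, but burnout after 5 years.
maximal = total_impact(effort_per_year=2.0, years_before_burnout=5.0)

print(sustainable, maximal)  # 40.0 10.0 — the sustainable pace wins
```

Under these made-up parameters the sustainable strategy does four times the total good, which is the point the burnout objection is making.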
A basilisk, strictly, is a thought that kills you if you think it, which is excessive for this case, and for Roko’s. I mean a thought that breaks your cognitive processes in some way if you think it. That seems a fair description of what happens to someone who, on contact with the idea “you’re murdering everyone you don’t try to save”, is consumed with guilt that their every moment is not devoted with maximum effort to saving the world.