I also regret contact with the basilisk, but would not say it’s the only information I wish I didn’t know, nor am I entirely sure it was a good idea to censor it.
When it was originally posted I did not take it seriously; it only triggered “severe mental trauma,” as others are saying, when I later read someone referring to its having been censored, felt some curiosity about it, and updated on the fact that it was being taken that seriously by others here.
I do not think the idea holds water, and I feel I owe much of my own severe mental trauma to ongoing anxiety and depression stemming from a host of ordinary factors, isolation chief among them. I would STRONGLY advise everyone in this community to take their mental health more seriously, not so much in terms of basilisks as in terms of being human beings.
This community is, as it stands, ill-equipped to charge forth valiantly into the unknown. It is neurotic at best.
I would also like to apologize for whatever part I played in the early formation of the cauldron of ideas that spawned the basilisk, and that I’m sure will spawn other basilisks in due time. I participated with fairly callous abandon in the SL4 threads that prefigured these ideas.
Even at the time it was apparent to anyone paying attention that these discussions were heading down a worrisome path, and I basically thought, “Well, I can see my way clear through these brambles; if other people can’t, that’s their problem.”
We have responsibilities, to ourselves as much as to each other, beyond simply being logical. I have lately been reexamining much of my life, and have taken to practicing meditation. I find it to be a significant aid in combating general anxiety.
...when I later read someone referring to its having been censored, felt some curiosity about it, and updated on the fact that it was being taken that seriously by others here.
If you join a community concerned with decision theory, are you surprised that they take problems in decision theory seriously?
There is no expected payoff in harming me just because decision theory implies it is rational, because I do not follow such procedures. If something wants to waste its resources on that, I win, because I weaken it: it has to spend resources on me that it could otherwise use, in the dark ages of the universe, to support a protégé. And it never receives any payoff for this, because I do not play along in any branch in which I exist.

You see, any decision theory is useless against agents that do not care about it. Utility is completely subjective, too; as Hume said, “’Tis not contrary to reason to prefer the destruction of the whole world to the scratching of my finger.” The whole problem arises because people think that if decision theory implies a strategy is favorable, then you have to follow through on it. Well, no. You can always say: fuck you! The might of God and of terrorists lies in the minds of their victims.
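To put the commitment argument in concrete terms, here is a minimal toy-payoff sketch. The payoff numbers and the `threatener_payoff` function are my own illustrative assumptions, not anything stated above: against a victim whose fixed policy is to never comply, following through on a threat strictly costs the threatener, so making the threat has no expected payoff to begin with.

```python
# Toy blackmail game: a minimal sketch with illustrative payoffs of my
# own choosing (nothing here comes from the comment above).

PUNISH_COST = 1.0   # resources the threatener burns by following through
COMPLY_GAIN = 5.0   # what the threatener gains if the victim gives in

def threatener_payoff(victim_complies: bool, follows_through: bool) -> float:
    """Payoff to the threatener for one round of the toy game."""
    if victim_complies:
        return COMPLY_GAIN          # threat worked; no need to punish
    return -PUNISH_COST if follows_through else 0.0

# A victim whose fixed policy is "never comply, whatever is threatened":
victim_complies = False

for follows_through in (True, False):
    payoff = threatener_payoff(victim_complies, follows_through)
    print(f"follows through={follows_through}: threatener payoff={payoff}")

# Output:
#   follows through=True: threatener payoff=-1.0
#   follows through=False: threatener payoff=0.0
# Against the never-comply policy, carrying out the threat is a pure loss,
# so a payoff-maximizing threatener gains nothing by making it.
```

Only the signs matter here: any positive punishment cost yields the same conclusion under these assumptions.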
If you join a community concerned with decision theory, are you surprised that they take problems in decision theory seriously?
Are they? Are they really? What actual, concrete actions have been taken, or are planned, regarding the basilisk? If people actually make material sacrifices based on having seen the basilisk, then I’m willing to take it seriously, if only for its effects on the human mind.
Then again, in the most worrying (or third most worrying, I guess) case, they would likely hide such activities to prevent anything from damaging their plans. They could also hide them out of altruism, to keep from disturbing halfway-smart basilisk seers like us, I guess.
...I have lately been reexamining much of my life, and have taken to practicing meditation. I find it to be a significant aid in combating general anxiety.
Also helpful: clonazepam.
...What actual, concrete actions have been taken, or are planned, regarding the basilisk?
I’m pretty sure no one firmly believes in the basilisk, simply because anyone who was convinced by it would be spreading it as much as they could.