By entirely I meant that there is no answer “yea” or “nay” that I personally would give* without knowing what your goal is, so that I can assess whether sacrificing your credibility is a winning or a losing strategy.
*Generally I assume I don’t need to write “in my opinion” in front of every post I make on LessWrong.
I understand. I wonder if that should have been clear to me. (For unrelated reasons, it’s hard to interpret the downvotes in this thread.)
Indeed. My first comment was downvoted as well, probably because I am talking to an agitator. And yours are being downvoted because you continue to exist. It’s all rather disheartening, like watching a crowd throw rotten tomatoes at an earnest but unpopular performer.
I still don’t understand your goal, though. You appear to be trying to manipulate everyone’s model of you such that we expect that your posts will violate community norms. It’s not even about “credibility,” and I was actually going to start out suggesting that we taboo “credibility.” If you don’t use that word, what you’re doing is “systematically violating community norms without explaining a reason” and is usually called trolling, and I think most people here assume it is trolling, and maybe I’m a fool for even considering that it might not be trolling.
Back when I was an ardent warrior of Political Party A, I used to go to forums dominated by Political Party B and post inflammatory things. I would have, at the time, defended these posts as honest attempts to spark discussion and educate. In retrospect, I admit that I was trolling, because there was no education happening. You can save yourself a lot of time, therefore, by considering your goals and considering your results.
You seem well-intentioned and interesting. I wish you well on your journeys. I will tell you, my goal is this: to serve God, and to save humanity. My immediate goal is this: to lose credibility as fast as is fucking possible, because the world is way scarier than I thought it was.
I would recommend taking some time to double-check this before doing something hard to undo.
Keep in mind Eliezer’s mistake with the basilisk. Based on a quick analysis, he decided the best course of action was to stop thinking about it and encourage others to do likewise. The problem (assuming my model of him is correct) is that since he stopped thinking about it, he didn’t realize his initial analysis was wrong. In fact as far as I know, he still hasn’t realized it.
How much time do you recommend? The thing is I already didn’t like credibility, so this new action isn’t a drastic change, just a quickening.
I’m actually not sure what you have in mind here. We might want to discuss this via PM. (Obviously I’m already familiar with the basilisk and the class of problems it’s representative of.)
♫ Jikai Yokoku (Next Episode Preview) ♫
PREVIEW
FAI Unit 01 is immobilized with Robin and Eliezer still boxed inside.
The Discussion board is in ruins.
The SIAI personnel imprisoned.
Will Newsome descends into Dogma.
The commenters chosen by fate finally assemble.
How will this tale of people who wish to become more rational play out?
Next, on LessWrong New Trolling Version: Q!
There’ll also be plenty of downvotes!
You Can (Not) Update, eh?
That’s what that song is called! Thanks!
Saabisu~ Saabisu~! (Service, service!)
:D :D :D
As I read more of this thread, I come to realize that you may actually have a good point. Now I’m curious. I’m going to PM you.
See here for example.
It isn’t a matter of time as much as making sure you’re actually spending that time thinking about the issue and not just repeating the same thoughts. Maybe get a second opinion.
Any particular reason for this?
I think you’re underestimating how carefully Eliezer and other SingInst folk have considered these ideas, especially in the wake of the Roko drama. Remember, the main concept even showed up on SL4 years ago, which is how I learned of it. (That is, they considered the ideas themselves, not the resulting social strategies. I’ll note that the Catholics, who have the cultural wisdom, don’t seem to have suppressed the knowledge of demons or ostracized people who demonized them, even if they went so far as to kill people who tried to commune with them. That said, suppressing that knowledge just wouldn’t have been possible until after the Enlightenment. Also, the Catholics might not have had good intentions.)
This is surprisingly hard to do in my current situation. If you’re lucky you might guess the reasons why.
There are lots of reasons, they all have to do with group epistemology and personal moral-epistemic practices. I’ll note that Steve Rayhawk, who is much smarter than me and almost certainly knows all of the arguments better than I do, seems to be equally obsessive in the exact opposite direction. But this isn’t a place where I should update on expected evidence—if you don’t know why you’re doing something, you won’t do it the right way.
Some theories:
a) You’ve figured out a way to summon demons and want to destroy your own credibility so that people don’t follow the train of thought in your old posts and figure it out also. If so, all I can say is that security by obscurity generally doesn’t work.
b) You’re getting possessed by demons and want to destroy your credibility to minimize the damage possessed!Will can do.
Summoning demons ain’t that hard, just hone your mathy AI and Hofstadter skills and find a silicon ritual device.
Even denying stuff gives too much evidence (correctly assuming people mostly believe such denials).
We should talk privately if we’re to get into any real discussion. No promises of anything, of course.
Getting possessed by demons sounds harder, in that context. I can compile simple algorithms to my brain and, say, sort a list of stuff faster than I could have before I learned any programming. But that’s about my limit. I know you’re a few standard deviations up at mental modeling, but are you good enough to become possessed?
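For concreteness, the kind of “simple algorithm” one might compile to one’s brain could look like selection sort. This is a minimal sketch of my own choosing, not anything the commenter specified:

```python
def selection_sort(xs):
    """Repeatedly find the minimum of the unsorted remainder
    and swap it to the front -- simple enough to run by hand."""
    xs = list(xs)  # work on a copy
    for i in range(len(xs)):
        # index of the smallest element in xs[i:]
        m = min(range(i, len(xs)), key=lambda j: xs[j])
        xs[i], xs[m] = xs[m], xs[i]
    return xs

print(selection_sort([3, 1, 4, 1, 5]))  # → [1, 1, 3, 4, 5]
```

It is O(n²), but its state at every step fits comfortably in working memory, which is presumably the point.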
Maybe not me, I’m not an AI programmer. A friend of mine has been AIXI for a few hours though, after taking certain substances at a certain famous event in a certain famous desert.
Well, don’t leave us twisting in the wind, Will—what did he witness?
Alas, for some reason I wasn’t very curious about his experience. I don’t even know which variation on AIXI he was.
Of course, I am only shooting in the dark, but do you think you may have been uncurious because your learning what he witnessed was correlated with an event that a nearby Power deemed insufficiently utilicious?
Heh, I’ll bet the Bayesian Conspiracy camp was a lot of fun. Hopefully he didn’t start eating his own head for more computational resources.
I suspect it involves taking various mind-altering substances.
Well, he has spoken of letting himself be influenced by spirits.
A discussion of why alcoholic spirits are called spirits was actually in my most recent comptheology post, but I cut it because it was off-topic. I’d like to hammer on that theme a little more though—i.e. how in the past people were just not that individualistic, and being influenced by spirits of any kind wasn’t abnormal. I suspect it is very different to live with those inductive biases.
I liked that. The historic support is good evidence for your model of people as running different copies of the same algorithms.
If you know the Jesuit mottos, you must have known for a long time that the world is much scarier than you can imagine. Combining obviously false claims with other, less obviously false claims causes me, and I would presume others in your intended audience, to question your less obviously false claims.
Certainly the effect this thread has on me is not to reduce your credibility with me. And I would claim that ranting crazily and throwing in semi-obvious errors of fact and logic would be a much more effective way to lower your credibility, and it seems obvious enough that you know this.
So your goal is not to lose credibility as fast as is possible (fucking or otherwise). You do lie. I must wonder whether your goal really is to serve God and to save humanity.
So far, we are in a room with a lot of messy hay and horseshit. There MUST be a pony in here somewhere. Is it the fallacy of this kind of reasoning that you are trying to make us realize?
That makes sense from a simulationist perspective: you’re trying to diminish your impact within the simulation, getting as far away as possible from being a nexus.
Why?
So that resources are allocated away from you, if you take the simulation to be a dynamic—if mindless—process?
Or because you are afraid you’re otherwise going to … draw attention to yourself? From … your simulators? You might call them God. Or maybe they might not like that.
You’d have to strike a careful balance: become too insignificant and you might just be demoted to NPC status, nice’d down, so to speak.