Because basically every cult has a 30 second boilerplate that looks exactly like that?
When I say “discuss safety”, I’m looking for a standard of discussion that is above that provided by actual, known-dangerous cults. Cults routinely use exactly the “check-ins” you’re describing as a way to emotionally manipulate members. And the “group” check-ins turn into peer pressure. So the only actual safety valve ANYWHERE in there is (D).
You’re proposing starting something that looks like a cult. I’m asking you for evidence that you are not, in fact, a cult leader. Thus far, almost all evidence you’ve provided has been perfectly in line with “you are a cult leader”.
If you feel this is an unfair standard of discussion, then this is probably not the correct community for you.
Also, this is very important: You’re asking people to sign a legal contract about finances without any way to terminate the experiment if it turns out you are in fact a cult leader. This is a huge red flag, and you’ve refused to address it.
I’m not interested in entering into a discussion where the standard is “Duncan must overcome an assumption that he’s a cult leader, and bears all the burden of proof.” That’s deeply fucked up, and inappropriate given that I willingly created a multi-thousand-word explanation for transparency and critique, and have positively engaged with all but the bottom 3% of commentary (of which I claim you are firmly a part).
I think you’re flat-out wrong in claiming that “almost all evidence you’ve provided has been perfectly in line with ‘you are a cult leader.’” The whole original post provided all kinds of models and caveats that distinguish it from the (correctly feared and fought-against) standard cult model. You are engaged in confirmation bias and motivated cognition and stereotyping and strawmanning, and you are the one who is failing to rise to the standard of discussion of this community, and I will not back off from saying it however much people might glare at me for it.
I’m not interested in entering into a discussion where the standard is “Duncan must overcome an assumption that he’s a cult leader, and bears all the burden of proof.”
While I agree that a lot of the criticism towards you has been hostile or at least pretty uncharitable, I would only point out that I suspect the default tendency most people have is to automatically reject anything that shows even the most minor outward signs of cultishness, and that these heavy prior beliefs will be difficult to overcome. So, it seems more likely that the standard is “outward signs of cultishness indicate a cult, and cults are really bad” rather than “Duncan is a cult leader.” (This is sort of similar to the criticisms of the rationality community in general).
I think there are a lot of reasons why people have such heavy priors here, and that they aren’t completely unjustified. I myself have them, because I feel that in most cases where I have observed outward signs of cultishness, it turned out these signals were correct in indicating an unhealthy or dangerous situation. I don’t think it’s necessary to go into detail about them because it would take a huge amount of space and we could potentially get into an endless debate about whether these details bear any similarity to the set-up you are proposing.
So it generally seems that your responses to the people who have these very heavy priors against what you are doing are along the lines of “You can’t just come in here with your heavy priors and expect that they alone constitute valid evidence that my proposal is a bad idea”, and in that regard your rebuttal is valid. However, I do personally feel that, when someone does show up in an argument with a very confident prior belief in something, the charitable principle is to assume at least initially that they have a possibly valid chain of evidence and reasoning that led them to that belief.
It could be that there is some social collective knowledge (like a history of shared experiences and reasoning) that led up to this belief, and therefore it is generally expected that we shouldn’t have to back-track through that reasoning chain (thereby allowing us to make confident statements in arguments without producing the evidence). I think that “cults” are a fairly good example of this kind of knowledge—things people almost universally consider bad, except for cult members themselves, so much so that saying otherwise could be considered taboo.
And this is definitely not to claim that every taboo is a justified taboo. It’s also not to say that you haven’t argued well or presented your arguments well. I’m only arguing that it’s going to be an uphill battle against the naysayers, and that to convince them they are wrong would probably require back-tracking through their chain of reasoning that led to their prior belief. In addition, if you find yourself becoming frustrated with them, just keep the above in mind.
For essentially the above reasons, my model predicts that most of the people who decide to participate in this endeavor will be those who trust you and know you very well, and possibly people who know and trust people who know and trust you very well. Secondly, my model also predicts that most of the participants will have done something similar to this already (the military, bootcamps, martial arts dojos, etc.) and successfully made it through them without burning out or getting distressed about the situation. Thus it predicts that people who don’t know you very well or who have never done anything similar to this before are unlikely to participate and are also unlikely to be swayed by the arguments given in favor of it. And even more unfortunately, due to the predicted composition of the participants, we may not be able to learn much about how successful the project will be for people who wouldn’t normally be inclined to participate, and so even if the outcome on the first run is successful, it will still be unlikely to sway those people.
I don’t place much weight on this model right now and I currently expect something like a 30% chance I will need to update it drastically. For example, you might already be receiving a ton of support from people who have never tried this and who don’t know you very well, and that would force me to update right away.
Also, even though I don’t know you personally, I generally feel positively towards the rationality community and feel safe in the knowledge that this whole thing is happening within it, because it means that this project is not too disconnected from the wider community and that you have sufficient disincentives from actually becoming a cult leader.
In short: Don’t let the negativity you are facing become too much of a burden. Just keep in mind that it’s possible that many of the most negative critics (besides obvious trolls) are not acting in bad faith, and that it could require more work than is feasible to engage with all of it sufficiently.
Also, this is very important: You’re asking people to sign a legal contract about finances without any way to terminate the experiment if it turns out you are in fact a cult leader. This is a huge red flag, and you’ve refused to address it.
I would be vastly reassured if you could stop dodging that one single point. I think it is a very valid point, no matter how unfair the rest of my approach may or may not be.
How can you read the parenthetical above and dismiss it as “not discussion” and still claim to be anything other than deontologically hostile?
I like everything you’ve said here, including the gentle pointers of places where I myself have been uncharitable or naive.