I am not an expert on this, but I think the kind of person I have in mind would not bother to look for willing BDSM victims. From their point of view, there are humans all around, and their consent is absolutely irrelevant, so they would optimize for some other criteria instead.
This feels to me like worrying that a vegetarian who eats “soy meat” is revealing an unconscious desire to eat meat, while there are real carnivores out there.
specify which kind of people you want to remove from the community
I am not even sure that “removing a kind of people” is the correct approach. (Fictional evidence says no.) My best guess at this moment would be to create a community where people are more open to each other, so that when someone harms another person, it is easily detected, especially if there is a pattern. This approach has its own possible problem with false reporting, which maybe could also be solved by noticing patterns.
Speaking about society in general, experience shows that sociopaths are likely to gain power in all kinds of organizations. It would be naive to expect that rationalist communities would somehow be immune to this, especially if we start “winning” in the real world. Sociopaths also have a natural advantage: they have more experience dealing with neurotypicals than neurotypicals have dealing with sociopaths.
I think someone should at least try to solve this problem, instead of pretending it doesn’t exist or couldn’t happen to us. It is only a question of time.
I am not an expert on this, but I think the kind of person I have in mind would not bother to look for willing BDSM victims. From their point of view, there are humans all around, and their consent is absolutely irrelevant, so they would optimize for some other criteria instead.
Human beings frequently like to think of people they don’t like or understand as evil. There are various very bad mental habits associated with that.
Academic psychology is a thing. It actually describes how certain people act. It describes how psychopaths act. They aren’t just evil; their emotional processes are screwed up in systematic ways.
My best guess at this moment would be to create a community where people are more open to each other, so that when someone harms another person, it is easily detected, especially if there is a pattern.
Translated into everyday language, that’s: “Rationalists should gossip more about each other.”
Whether we should follow that maxim is quite a complex topic on its own. If you think it’s important, write an article about it and actually address the reasons why people don’t like to gossip.
I think someone should at least try to solve this problem, instead of pretending it doesn’t exist or couldn’t happen to us.
You are not really addressing what I said. It’s very likely that we have people in this community who fulfill the criteria of clinical psychopathy. I also remember an account of a person who trusted a self-declared egoist from a LW meetup too much and ended up with a bad interaction, because they didn’t take at face value the openness of someone who said they only care about themselves.
Given your moderator position, do you find that you want to do some gardening but lack the power at the moment? Especially for dealing with the obvious case?
If so, that’s a real concern. Probably worth addressing more directly.
Unfortunately, I don’t feel qualified enough to write an article about this, nor to analyze the optimal form of gossip. I don’t think I have a solution. I just noticed a danger, and a general unwillingness to debate it.
Probably the best thing I can do right now is to recommend good books on this topic. That would be:
The Mask of Sanity by Hervey M. Cleckley; specifically the 15 examples provided; and
People of the Lie by M. Scott Peck; this book is not scientific, but is much easier to read
I admit I do have some problems with moderating (specifically, the reddit database is pure horror, so it takes a lot of time to find anything), but my motivation for writing in this thread comes completely from offline life.
As a leader of my local rationalist community, I have been wondering about what could happen if the community becomes larger and more successful. If something bad happened within the community, I would feel personally responsible for the people I had invited there with visions of rationality and “winning”. (And “something bad” offline can be much worse than mere systematic downvoting.) Especially if we were to achieve some kind of power in real life, which is what I hope to do one day. I want to do something better than just bring a lot of enthusiastic people to one place and let fate decide. I trust myself not to start a cult and not to abuse others, but that by itself is no reason for others to trust me; also, someone else may replace me (rather easily, since I am not good at coalition politics), or someone may do evil things under my roof without me even noticing. Having a community of highly intelligent people carries the risk that any sociopaths who come will likely also be highly intelligent. So I am thinking about what makes a community safe or unsafe, because if the community grows large enough, sooner or later problems start happening. I would rather be prepared in advance; trying to solve the problem ad hoc would probably look like personal animosity or like joining one faction in an internal conflict.
Can you express what you want to protect against while tabooing words like “bad”, “evil”, and “abuse”?
In an ideal world we could fully trust all people in our tribe to do nothing bad. Simply because we have known a person for years, we could trust them to do good.
That’s not a rational heuristic. Our world is not structured in a way where the amount of time we have known a person is a good heuristic for how much trust we can give that person.
There are a bunch of people I have met through personal development circles whom I trust very easily because I know the heuristics that those people use.
If you have someone in your local LW group who tells you that his utility function is to maximize his own utility, and who doesn’t have the empathy that would make him feel bad about abusing others, the rational thing is not to trust that person very much.
But if you use that as a criterion for kicking people out, people won’t be open about their beliefs anymore.
In general, it isn’t a good idea to place a lot of trust in people who tick half of the criteria that constitute clinical psychopathy.
On the other hand, LW is by default inclusive and not structured in a way where it’s a good idea to kick people out on such a basis.
If you have someone in your local LW group who tells you that his utility function is to maximize his own utility, and who doesn’t have the empathy that would make him feel bad about abusing others, the rational thing is not to trust that person very much.
Intelligent sociopaths generally don’t go around telling people that they’re sociopaths (or words to that effect), because that would put others on their guard and make them harder to get things out of. I have heard people saying similar things before, but they’ve generally been confused teenagers, Internet Tough Guys, and a few people who’re just really bad at recognizing their own emotions—who also aren’t the best people to trust, granted, but for different reasons.
I’d be more worried about people who habitually underestimate the empathy of others and don’t have obviously poor self-image or other issues to explain it. Most of the sociopaths I’ve met have had a habit of assuming those they interact with share, to some extent, their own lack of empathy: probably typical-mind fallacy in action.
Intelligent sociopaths generally don’t go around telling people that they’re sociopaths (or words to that effect), because that would put others on their guard and make them harder to get things out of.
They usually won’t say it in a way that they predict will put other people on guard. On the other hand, that doesn’t mean they don’t say it at all.
I can’t find the link at the moment, but a while ago someone posted on LW that he shouldn’t have trusted another person from a LW meetup who openly said those things and then acted accordingly.
Categorising Internet Tough Guys is hard. Base rates for psychopathy aren’t that low, but you are right that not everyone who says those things is a psychopath.
Even so, it’s a signal not to give that person full trust.
(a) What exactly is the problem? I don’t really see a sociopath getting enough power in the community to take over LW as a realistic scenario.
(b) What kind of possible solutions do you think exist?
My best guess at this moment would be to create a community where people are more open to each other, so that when someone harms another person, it is easily detected, especially if there is a pattern.
What do you mean by “harm”? I have to ask because there is a movement (commonly called SJW) pushing an insanely broad definition of “harm”. For example, if you’ve shattered someone’s worldview, have you “harmed” him?
if you’ve shattered someone’s worldview, have you “harmed” him?
Not per se, although there could be some harm in the execution. For example, if I decide to follow someone home from work every day, screaming “Jesus is not real” at them, the problem is with me following them every day, not with the message. Or if they are at their mother’s funeral and the priest is saying “let’s hope we will meet our beloved Jane in heaven with Jesus”, that would not be a proper moment to jump up and scream “Jesus is not real”.