Unfortunately, I don’t feel qualified enough to write an article about this, nor to analyze the optimal form of gossip. I don’t think I have a solution. I just noticed a danger, and general unwillingness to debate it.
Probably the best thing I can do right now is to recommend good books on this topic. That would be:
The Mask of Sanity by Hervey M. Cleckley; specifically the 15 examples provided; and
People of the Lie by M. Scott Peck; this book is not scientific, but it is much easier to read
I admit I do have some problems with moderating (specifically, the Reddit database is pure horror, so it takes a lot of time to find anything), but my motivation for writing in this thread comes entirely from offline life.
As the leader of my local rationalist community, I have been wondering about what could happen if the community becomes larger and more successful. If something bad happened within the community, I would feel personally responsible for the people I have invited there with visions of rationality and “winning”. (And “something bad” offline can be much worse than mere systematic downvoting.) Especially if we achieved some kind of power in real life, which is what I hope to do one day. I want to do something better than just bring a lot of enthusiastic people to one place and let fate decide. I trust myself not to start a cult and not to abuse others, but that by itself is no reason for others to trust me; also, someone else may replace me (rather easily, since I am not good at coalition politics), or someone may do evil things under my roof without me even noticing. A community of highly intelligent people carries the risk that any sociopaths who come will likely also be highly intelligent. So I am thinking about what makes a community safe or unsafe, because if the community grows large enough, sooner or later problems start happening. I would rather be prepared in advance; trying to solve the problem ad hoc would probably look like personal animosity, or like joining one faction in an internal conflict.
Can you express what you want to protect against while tabooing words like “bad”, “evil”, and “abuse”?
In an ideal world, we could fully trust everyone in our tribe to do nothing bad. Simply because we had known a person for years, we could trust them to do good.
That’s not a rational heuristic. Our world is not structured in a way where how long we have known a person is a good guide to how much trust we can give them.
There are a bunch of people I have met through personal development whom I trust very easily, because I know the heuristics those people use.
If someone in your local LW group tells you that his utility function is to maximize his own utility, and he doesn’t have the empathy that would make him feel bad when he abuses others, the rational thing is not to trust that person very much.
But if you use that as a criterion for kicking people out, people won’t be open about their own beliefs anymore.
In general, giving a lot of trust to people who tick half of the criteria that constitute clinical psychopathy isn’t a good idea.
On the other hand, LW is inclusive by default and not structured in a way where it’s a good idea to kick people out on such a basis.
If someone in your local LW group tells you that his utility function is to maximize his own utility, and he doesn’t have the empathy that would make him feel bad when he abuses others, the rational thing is not to trust that person very much.
Intelligent sociopaths generally don’t go around telling people that they’re sociopaths (or words to that effect), because that would put others on their guard and make them harder to get things out of. I have heard people saying similar things before, but they’ve generally been confused teenagers, Internet Tough Guys, and a few people who’re just really bad at recognizing their own emotions—who also aren’t the best people to trust, granted, but for different reasons.
I’d be more worried about people who habitually underestimate the empathy of others and don’t have obviously poor self-image or other issues to explain it. Most of the sociopaths I’ve met have had a habit of assuming those they interact with share, to some extent, their own lack of empathy: probably typical-mind fallacy in action.
Intelligent sociopaths generally don’t go around telling people that they’re sociopaths (or words to that effect), because that would put others on their guard and make them harder to get things out of.
They usually won’t say it in a way that they predict will put other people on their guard. On the other hand, that doesn’t mean they don’t say it at all.
I can’t find the link at the moment, but a while ago someone posted on LW that he shouldn’t have trusted another person from an LW meetup who openly said such things and then acted on them.
Categorising Internet Tough Guys is hard. Base rates for psychopathy aren’t that low, but you are right that not everyone who says those things is a psychopath. Even so, it’s a signal for not giving that person full trust.
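To put rough numbers on “signal, not proof”, here is a quick back-of-the-envelope Bayesian update, sketched in Python. Every number in it is an assumption picked for illustration (the ~1% prior is in the ballpark commonly cited for psychopathy in the general population; the two likelihoods are pure guesses), not a measured value:

```python
# Back-of-the-envelope Bayes: how much should openly saying
# "I maximize my own utility and feel no empathy" shift our
# estimate that someone is a psychopath?
# All three numbers below are illustrative assumptions.

prior = 0.01                 # assumed base rate of psychopathy (~1%)
p_says_given_psycho = 0.10   # assumed: most psychopaths hide it
p_says_given_not = 0.005     # assumed: Internet Tough Guys etc. are rare

# Total probability of hearing such a statement at all
p_says = p_says_given_psycho * prior + p_says_given_not * (1 - prior)

# Bayes' rule: P(psychopath | says it)
posterior = p_says_given_psycho * prior / p_says

print(f"P(psychopath | says it) = {posterior:.2f}")  # ~0.17 here
```

With these made-up numbers the posterior lands around 17%: nowhere near certainty, but a large enough jump from the 1% prior to justify withholding full trust, which is exactly the point above.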