To coordinate we need a leader that many of us would sacrifice for. The obvious candidates are Eliezer Yudkowsky, Peter Thiel, and Scott Alexander. Perhaps we should develop a process by which a legitimate, high-quality leader could be chosen.
Edit: I see mankind as walking towards a minefield. We are almost certainly not in the minefield yet; at our current rate we will almost certainly hit it this century; lots of people don’t think the minefield exists, or think that fate or God will protect us from it; and competitive pressures (Moloch) make lots of people individually better off if they push us a bit faster towards it.
I disagree. The LW community already has capable high-status people who many others in the community look up to and listen to suggestions from. It’s not clear to me what the benefit is from picking a single leader. I’m not sure what kinds of coordination problems you had in mind, but I’d expect that most such problems that could be solved by a leader issuing a decree could also be solved by high-status figures coordinating with each other on how to encourage others to coordinate. High-status people and organizations in the LW community communicate with each other a fair amount, so they should be able to do that.
And there are significant costs to picking a leader. It creates a single point of failure, makes the leader’s mistakes more costly, and inhibits innovation in leadership style. It also creates PR problems; in fact, LW has already faced PR problems over being perceived as an Eliezer Yudkowsky personality cult.
Also, if we were to pick a leader, Peter Thiel strikes me as an exceptionally terrible choice.
The ten upvotes you have for this post are a signal that either we shouldn’t have a leader, or that if we should, it would be difficult for any leader to overcome the rationality movement’s opposition to having one.
Speaking for myself (one of the upvotes), I think that having a single leader is bad, but having a relatively small group of leaders is good.
With one leader, anything they do or say (or did or said years or decades ago) becomes interpreted as “this is what the whole rationalist community is about”. Also, I feel like focusing too much on one person could make others feel like followers, instead of striving to become stronger.
But if we have a small team of people who are highly respected by the community, and publicly acknowledge each other, and can cooperate with each other… then all we need for coordination is for them to meet in the same room once in a while and publish a common statement afterwards.
I don’t want to choose between Eliezer Yudkowsky, Peter Thiel, and Scott Alexander (and other possible candidates, e.g. Anna Salamon and Julia Galef). Each of these people is really impressive in some areas, but none of them is impressive at everything. Choosing one of them feels like deciding which aspects we should sacrifice. Also, some competition is good, and a person who is great today may become less great tomorrow.
Or maybe the leader does not have to be great at everything, as long as they are great at “being a great rationalist leader”, whatever that means. But maybe we actually don’t have this kind of person yet. (Weak evidence: if a person with such skills existed, they would probably already be informally accepted as the leader of the rationalists; they wouldn’t wait for a comment on LW to tell them to step forward.) Peter Thiel doesn’t seem to communicate with the rationalist community. Eliezer Yudkowsky is hiding on Facebook. Scott Alexander has an unrelated full-time job. Maybe none of them actually has enough time and energy to do the job of the “rationalist leader”, whatever that might be.
Also, I feel like asking for a “leader” is the instinctive, un-narrow, halo-effect approach typically generated by the corrupted human hardware. What specific problem are we trying to solve? Lack of communication and coordination in the rationalist community? I suggest Community Coordinator as a job title, and it doesn’t have to be any of these high-status people, as long as it is a person with good people skills who cooperates with them (uhm, maybe Cat Lavigne?). Maybe even a Media Speaker who would, once a week or once a month, collect information about “what’s new in the rationalist community” and compose an official article.
tl;dr—we don’t need a “leader”, but we need people who will do a few specific things which are missing; coordination of the community being one of them
Part of the advantage of having a leader is that he/she could specialize in leading us and we could pay him/her a full-time salary. “Also, I feel like asking for a “leader” is the instinctive, un-narrow, halo-effect approach typically generated by the corrupted human hardware.” Yes, but this is what works.
Please taboo “leading us”. What is the actual job description for the leader you imagine? What is the expected outcome of having such a leader?
And, depending on your previous answer, could we achieve a similar outcome by simply having a specialist for a given task? I mean, even actual leaders employ specialists, so why not skip the middleman? (Or do you believe that the leader would be better at finding the specialists? That sounds almost like a job description… of a specialist.)
Or is the leader supposed to be a symbol? A speaker for the movement?
Or perhaps a person who chooses an arbitrary goal (a meaningful one, but ultimately it would be an arbitrary choice among a few meaningful candidates) under the assumption that if we all focus on one goal, we are more likely to achieve it than if everyone follows a different goal (i.e. a suboptimal choice is still much better than no choice)?
I want someone who could effectively give orders/strong suggestions saying “give to this cause”, “write to your congressman saying this”, “if you have this skill please do this”, “person A should help person B get this job”, “person C is toxic and should be excluded from our community”, “person D is fantastic, let’s recruit her to our community”, “everyone please read this and discuss”, “person E is great, everyone thank her”, “person F has made great contributions to our community but has suffered some recent bad news so let’s help her out”.
I agree that all of this could be useful in many situations.
I just suspect there may be no person fit for this role and willing to take it, and that choosing an unfit person could be harmful. Essentially, people who are sufficiently sane and uncontroversial are probably not interested in this role, because they believe they have better things to do; otherwise, they would already have taken it.
All it would take at the beginning would be to privately ask the other “rationalist celebrities” whether they think that X is a good idea and whether they are willing to endorse it publicly, and if they say yes, post X in Main with the list of celebrities who endorse it. If the same person did this 5 times in a row, people would automatically start accepting them as the leader. Most wouldn’t notice if, the sixth time, the endorsements from the other “rationalist celebrities” were absent, as long as none of them opposed the post directly.
Please taboo “leading us”. What is the actual job description for the leader you imagine?

Telling you what to think and what to do, of course. Without a Glorious Leader you would just wander around, lost and confused.

Who is that “we”?
Also, if we were to pick a leader, Peter Thiel strikes me as an exceptionally terrible choice.
I agree we shouldn’t pick a leader, but I’m curious why you think this. He’s the only person on the list who actually has leadership experience (CEO of PayPal), and he did a pretty good job.
Leading a business and leading a social movement require different skill sets, and Peter Thiel is also the only person on the list who isn’t even part of the LW community. Bringing in someone only tangentially associated with a community as its leader doesn’t seem like a good idea.
The key to deciding if we need a leader is to look at historically similar situations and see if they benefited from having a leader. Given that we would very much like to influence government policy, Peter Thiel strikes me as the best possible choice if he would accept. I read somewhere that when Julius Caesar was going to attack Rome several Senators approached Pompey the Great, handed him a sword, and said “save Rome.” I seriously think we should try something like this with Thiel.
Given that we would very much like to influence government policy
How would the position of leader of the LW community help Peter Thiel do this? Also, Peter Thiel’s policy priorities seem to differ a fair amount from those of the average lesswronger, and I’d be pretty surprised if he agreed to change priorities substantially in order to fit with his role as LW leader.
Given that we would very much like to influence government policy
Is this actually a thing that we would want? It seems to me like this line of reasoning depends on a lot of assumptions that don’t seem all that shared.
(I do think that rationalists should coordinate more, but I don’t think rationalists executing the “just obey authority” action is likely to succeed. That seems like a recipe for losing a lot of people from the ‘rationalist’ label. I think there are other approaches better suited to the range of rationalist personalities that still have enough tradition behind them to be likely to work; the main inspirations here are Norse þings and Quaker meetings.)
I read somewhere that when Julius Caesar was going to attack Rome several Senators approached Pompey the Great, handed him a sword, and said “save Rome.” I seriously think we should try something like this with Thiel.
At the moment Peter Thiel should spend all his available time recruiting people for the Trump administration to fill those 4,000 positions that are open. Asking him to spend any time elsewhere is likely not effective.
If I remember correctly, history records Caesar as having been relentlessly successful in that campaign?
If Alyssa Vance is correct that the community is bottlenecked on idea generation, I think this is exactly the wrong way to respond. My current view is that increasing hierarchy has the advantage of helping people coordinate better, but it has the disadvantage that people are less creative in a hierarchical context. Isaac Asimov on brainstorming:
If a single individual present has a much greater reputation than the others, or is more articulate, or has a distinctly more commanding personality, he may well take over the conference and reduce the rest to little more than passive obedience. The individual may himself be extremely useful, but he might as well be put to work solo, for he is neutralizing the rest.
I believe this has already happened to the community through the quasi-deification of people like Eliezer, Scott, and Gwern. It’s odd, because I generally view the LW community as quite nontraditional. But when I look at academia, I get the impression that college professors are significantly closer in status to their students than our intellectual leadership is to the rest of us.
This is my steelman of people who say LW is a cult. It’s not a cult, but large status differences might be a sociological “code smell” for intellectual communities. Think of the professor who insists that they always be addressed as “Dr. Jones” instead of being called by their first name. This is rarely the sort of earnest, energetic, independent-minded person who makes important discoveries. “The people I know who do great work think that they suck, but that everyone else sucks even more.”
The problem is compounded by the fact that Eliezer, Scott, and Gwern are not actually leaders. They’re high status, but they aren’t giving people orders. This leads to leadership vacuums.
My current guess is that we should work on idea generation at present, then transform into a more hierarchical community when it’s obvious what needs to be done. I don’t know what the best community structure for idea generation is, but I suspect the university model is a good one: have a selective admissions process, while keeping the culture egalitarian for people who are accepted. At least this approach is proven.
I shall preface by saying that I am neither a rationalist nor an aspiring rationalist. Instead, I would classify myself as a “rationality consumer”—I enjoy debating philosophy and reading good competence/insight porn. My life is good enough that I don’t anticipate much subjective value from optimizing my decisionmaking.
I don’t know how representative I am. But I think that if you want to reach “people who have something to protect” you need different approaches than for “people who like competence porn”, and while a site like LW can serve both groups, we are to some extent running into issues where our population may be largely the latter instead of the former. People admire Gwern, but who wants to be Gwern? Who wants to be like Eliezer or lukeprog? We may not want leaders, but we don’t even have heroes.
I think possibly what’s missing, and this is especially relevant in the case of CFAR, is a solid, empirical, visceral case for the benefit of putting the techniques into action. At the risk of being branded outreach, and at the very real risk of significantly skewing their post-workshop stats gathering, CFAR should possibly put more effort into documenting stories of success through applying the techniques. I think the main focus of research should be full System-1 integration, not just for the techniques themselves but also for CFAR’s advertisement. I believe it’s possible to do this responsibly if one combines it with transparency and System-2 relevant statistics. Contingent, of course, on CFAR delivering the proportionate value.
I realize that there is a chicken-and-egg problem here where for reasons of honesty, you want to use System-1-appealing techniques that only work if the case is solid, which is exactly the thing that System-1 is traditionally bad at! I’m not sure how to solve that, but I think it needs to be solved. To my intuition, rationality won’t take off until it’s value-positive for S1 as well as S2. If you have something to protect you can push against S1 in the short-term, but the default engagement must be one of playful ease if you want to capture people in a state of idle interest.
CFAR should possibly put more effort into documenting stories of success through applying the techniques.
They do put effort into this; I do wonder how communicable it is, though.
For example, at one point Anna described a series of people all saying something like “well, I don’t know if it had any relationship to the workshop, but I did X, Y, and Z” during followups that, across many followups, seemed obviously due to the workshop. But it might be a vague thing that’s easier to see when you’re actually doing the followups rather than communicating statistics about followups.
I shall preface by saying that I am neither a rationalist nor an aspiring rationalist. Instead, I would classify myself as a “rationality consumer”—I enjoy debating philosophy and reading good competence/insight porn. My life is good enough that I don’t anticipate much subjective value from optimizing my decisionmaking.
Thanks so much for saying this! Thinking about this distinction you made, I feel there may be actually four groups of LW readers, with different needs or expectations from the website:
“Science/Tech Fans”—want more articles about new scientific research and new technologies. “Has anyone recently discovered a new particle, or built a new machine? Give me a popular science article about it!”
“Competence/Insight Consumers”—want more articles about pop psychology theories and life hacks. They feel they are already doing great, and only want to improve small details. “What do you believe is the true source of human motivation, and how do you organize your to-do lists? But first, give me your credentials: are you a successful person?”
“Already Solving a Problem”—want feedback on their progress, and information specifically useful for them. Highly specific; two people in the same category working on completely different problems probably wouldn’t benefit too much from talking to each other. If they achieve critical mass, it would be best to make a subgroup for them (except that LW currently does not support creating subgroups).
“Not Started Yet”—inspired by the Sequences, they would like to optimize their lives and the universe, but… they are stuck in place, or advancing very very slowly. They hope for some good advice that would make something “click”, and help them leave the ground.
Maybe it’s poll time… what do you want to read about?
If anyone’s mind is in a place where they think they’d be more productive or helpful if they sacrificed themselves for a leader, then, with respect, I think the best thing they can do for protecting humanity’s future is to fix that problem in themselves.
The way people normally solve big problems is to have a leader people respect, follow, and are willing to sacrifice for. If there is something in rationalists that prevents us from accepting leadership then the barbarians will almost certainly beat us.
I see two unrelated sub-problems: one of prediction and one of coordination.
We already know that experts are better than laymen, but differentiated groups perform better at prediction than individual experts. Thus a decision taken by an aggregated prediction market will be better in terms of accuracy; the problem here is that people in general do not coordinate well in a horizontal structure.
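A minimal sketch of why the accuracy half of this claim holds, assuming Python with numpy (the forecasters and all numbers below are invented for illustration): for squared error there is an exact identity, sometimes called the diversity prediction theorem, which says the crowd mean’s error equals the average individual error minus the variance of the predictions. So a simple average can never do worse than the typical member, and does strictly better whenever members disagree.

```python
# Illustrative sketch (not from the thread): the "diversity prediction
# theorem" identity behind the claim that diverse groups out-predict
# their typical member. All quantities here are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
truth = 10.0   # the quantity everyone is trying to predict
n = 50         # number of forecasters

# Diverse forecasters: each estimate is the truth plus idiosyncratic noise.
predictions = truth + rng.normal(loc=0.0, scale=3.0, size=n)

crowd = predictions.mean()                       # aggregate forecast
crowd_error = (crowd - truth) ** 2               # squared error of the crowd
avg_individual_error = ((predictions - truth) ** 2).mean()
diversity = ((predictions - crowd) ** 2).mean()  # spread of the predictions

# Exact identity: crowd error = average individual error - diversity.
# Hence the crowd is never worse than the average member, and strictly
# better whenever the predictions differ at all.
assert np.isclose(crowd_error, avg_individual_error - diversity)
print(f"crowd error:          {crowd_error:.3f}")
print(f"avg individual error: {avg_individual_error:.3f}")
print(f"diversity:            {diversity:.3f}")
```

Note that this only addresses the prediction sub-problem; the identity says nothing about coordination, which is exactly the part a horizontal structure leaves unsolved.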