It sounds to me like half of the perceived public image problem comes from apparently blurred lines between the SIAI and LessWrong, and between the SIAI and Eliezer himself. These could be real problems (I generally have difficulty explaining one of the three without mentioning the other two), but I’m not sure how significant they are.
The ideal situation would be for people to evaluate SIAI based on its publications, the justification of its research areas, and whether its current and proposed projects best serve those goals, are reasonably budgeted, and are making progress.
Whoever actually evaluates SIAI on those points will find the list of achievements. Each project has a detailed proposal and a budget breakdown, since donors can choose to donate directly to one research project or another.
Finally, a large number of those projects are academic papers. If you dig a bit, you’ll find that many of these papers are presented at academic and industry conferences. Hosting the Singularity Summit doesn’t hurt either.
It doesn’t make sense to downplay a researcher’s strange viewpoints if those viewpoints seem valid. Eliezer believes his viewpoint to be valid. LessWrong, a project of his, has a lot of people who agree with his ideas. There are also people who disagree with some of his ideas, but the point is that it shouldn’t matter: LessWrong is a project of SIAI, not the organization itself. Support for his ideas on this website should have little to do with SIAI’s support of his ideas.
Your points seem to be that claims made by Eliezer and upheld by the SIAI don’t appear credible due to insufficient argument, and due to one person’s personality. You can argue all you want about how he is viewed. You can debate the published papers’ worth. But the two shouldn’t be equated. This despite the fact that he’s written half of the publications.
Here are the questions, tied to your post, that I think are worth discussing with respect to public relations, if not the contents of the publications:
- Do people equate “The views of Eliezer Yudkowsky” with “The views of SIAI”? Do people view the research program or organization as “his” project?
- Which people, and to what extent?
- Is this good or bad, and how important is it?
The optimal answers to those questions are whichever ones lead the most AI researchers to give the most publications serious scrutiny and consideration.
I’ll repeat that other people have published papers with the SIAI, that their proposals are spelled out, that some papers are presented at academic and industry conferences, and that the SIAI’s Singularity Summit hosts speakers who do not agree with all of Eliezer’s opinions but who nonetheless associate with the organization by attending.
To top it off, the SIAI is responsible for getting James Randi’s seal of approval on the Singularity being probable. That’s not poisoning the meme, not one bit.
I feel it’s worth pointing out that just because something should be the case doesn’t mean it is. You state:
> Your points seem to be that claims made by Eliezer and upheld by the SIAI don’t appear credible due to insufficient argument, and due to one person’s personality. You can argue all you want about how he is viewed. You can debate the published papers’ worth. But the two shouldn’t be equated.
I agree with the sentiment, but how practical is it? Just because it would be incorrect to equate Eliezer and the SIAI doesn’t mean that people won’t do it. Perhaps it would be reasonable to say that the people who fail to make the distinction are also the people on whom it’s not worth expending the effort to explicate the situation, but I suspect that the majority of people are going to have a hard time not making that equation, if they even try at all.
The point of this article, I would presume to say, is that public relations actually does serve a valid and useful purpose. It is not a wasted effort to ensure that the ideas one considers true, or at least worthwhile, are presented in the sort of light that encourages people to take them seriously. This is something that I think many people of a more intellectual bent fail to consider; though some of us might invest time and effort into determining for ourselves whether an idea is good or not, I would say the majority do not, and instead rely on trusted sources to guide them (often with disastrous results).
Again, it may just be that we don’t care about those people (and it’s certainly tempting to go that way), but there may be times when quantity of supporters, in addition to quality, could be useful.
We don’t disagree on any point that I can see. I was contrasting an ideal way of looking at things (part of what you quoted) with how people might actually see things (my three bullet-point questions).
As much as I enjoy Eliezer’s thoughts and respect his work, I’m also of the opinion that one of the tasks the SIAI must work on (and almost certainly is working on) is keeping his research going while making the distinction between the two entities more obvious. But to whom? The research community should be the first and primary target.
Coming back from the Summit, I feel that they’re taking decent measures toward this. The most important thing to do is make the other SIAI names known. Michael Vassar’s name is the easiest for people to remember because of his role’s title, and he was acting as the face of the SIAI more than Eliezer was. At this point, a dispute would make the SIAI look unstable; they need positive promotion of leadership and idea diversity, more public awareness of their interactions with academia, and that’s about it.
Housing a clearly promoted second research program would solve this problem, if only there were enough money, a second goal that didn’t obviously conflict with the first, and a program that still fit under the mission statement. I don’t know if that is possible. Money aside, I think it is: decision-theoretic research with respect to FAI is just one area of FAI research. Utterly essential, but probably not all there is to do.