You’re allowed to say these things on the public Internet?
I just fell in love with SI.
Well, at our most recent board meeting I wasn’t fired, reprimanded, or even questioned for making these comments, so I guess I am. :)
Not even funny looks? ;)
It’s Luke you should have fallen in love with, since he is the one turning things around.
On the other hand I can count with one hand the number of established organisations I know of that would be sociologically capable of ceding power, status and control to Luke the way SingInst did. They took an untrained intern with essentially zero external status from past achievements and affiliations and basically decided to let him run the show (at least in terms of publicly visible initiatives). It is clearly the right thing for SingInst to do, and admittedly Luke is very tall and has good hair, which generally gives a boost when it comes to such selections—but still, making the appointment goes fundamentally against normal human behavior.
(Where I say “count with one hand” I am not including the use of any digits thereupon. I mean one.)
It doesn’t matter that I completely understand why this phrase was included, I still found it hilarious in a network sitcom sort of way.
Consider the implications in light of HoldenKarnofsky’s critique of SI’s pretensions to high rationality.
Rationality is winning.
SI, at the same time as it was claiming extraordinary rationality, was behaving in ways that were blatantly irrational.
Although this is supposedly due to “the usual causes,” rationality (winning) subsumes overcoming akrasia.
HoldenKarnofsky is correct that SI made claims of extraordinary rationality at a time when its leaders weren’t rational.
Further: why should anyone give SI credibility today, when it stands convicted of self-serving misrepresentation in the recent past?
As a minor note, observe that claims of extraordinary rationality do not necessarily contradict claims of irrationality. The sanity waterline is very low.
Do you mean to imply in context here that the organizational management of SIAI at the time under discussion was above average for a nonprofit organization? Or are you just making a more general statement that a system can be irrational while demonstrating above average rationality? I certainly agree with the latter.
Are you comparing it to the average among nonprofits started, or nonprofits extant? I would guess that it was well below average for extant nonprofits, but about or slightly above average for started nonprofits. I’d guess that most nonprofits are started by people who don’t know what they’re doing and don’t know what they don’t know, and that SI probably did slightly better because the people who were being a bit stupid were at least very smart, which can help. However, I’d guess that most such nonprofits don’t live long because they don’t find a Peter Thiel to keep them alive.
Your assessment looks about right to me. I have considerable experience of averagely-incompetent nonprofits, and SIAI looks normal to me. I am strongly tempted to grab that “For Dummies” book and, if it’s good, start sending copies to people …
In the context of thomblake’s comment, I suppose nonprofits started is the proper reference class.
I don’t see the point of comparing to average nonprofits. Average for-profits don’t realize any profit, and average non-profits just waste money.
I would say SIAI is best paralleled to the average newly started ‘research’ organization developing some free-energy something-or-other, run by non-scientists, with some hired scientists as chaff.
Sadly, I agree. Unless you look at it very closely, SIAI pattern-matches to “crackpots trying to raise money to fund their crackpottiness” fairly well. (What saves them is that their ideas are a lot better than the average crackpot’s.)
Or are you just making a more general statement that a system can be irrational while demonstrating above average rationality?
Yes, this.
On an arbitrary scale I just made up, below 100 degrees of rationality is “irrational”, and 0 degrees of rationality is “ordinary”. 50 is extraordinarily rational and yet irrational.
Being at 50 while thinking you’re at 100 makes you an extraordinary loser (overconfidence leads to big failures).
In any case this is just word play. Holden has seen many organizations that are or were more rational; that’s probably what he means by a lack of extraordinary rationality.
You’ve misread the post—Luke is saying that he doesn’t think the “usual defeaters” are the most likely explanation.
Correct.
Just to let you know, you’ve just made it onto my list of the very few LW regulars I no longer bother replying to, due to the proven futility of any communication. In your case it is because you have a very evident ax to grind, which is incompatible with rational thought.
This comment seems strange. Is having an ax to grind opposed to rationality? Then why does Eliezer Yudkowsky, for example, not hesitate to advocate for causes such as friendly AI? Doesn’t he have an ax to grind? More of one, really, since his ax chops trees of gold.
It would seem that intellectual honesty requires you to say you reject discussions with people who have an ax to grind, unless you grind a similar ax.
From http://www.usingenglish.com: “If you have an axe to grind with someone or about something, you have a grievance, a resentment and you want to get revenge or sort it out.” One can hardly call the unacknowledged emotions of resentment and needing revenge/retribution compatible with rationality. srdiamond piled up a bunch of (partially correct but irrelevant in the context of my comment) negative statements about SI, making these emotions quite clear.
That’s a restrictive definition of “ax to grind,” by the way—it’s normally used to mean any special interest in the subject: “an ulterior often selfish underlying purpose” (Merriam-Webster’s Collegiate Dictionary).
But I might as well accept your meaning for discussion purposes. If you detect unacknowledged resentment in srdiamond, don’t you detect unacknowledged ambition in Eliezer Yudkowsky?
There’s actually good reason for the broader meaning of “ax to grind.” Any special stake is a bias. I don’t think you can say that someone who you think acts out of resentment, like srdiamond, is more intractably biased than someone who acts out of other forms of narrow self-interest, which almost invariably applies when someone defends something he gets money from.
I don’t think it’s rational to treat people as inherently less rational simply because they seem resentful. Resentment is only one of many difficult biases. Financial interest is probably more biasing. If you think the arguments are crummy, that’s something else. But the motive—resentment or finances—should probably have little bearing on how a message is treated in serious discussion.
Eliezer certainly has a lot of ambition, but I am surprised to see an accusation that this ambition is unacknowledged.
The impression I get from scanning their comment history is that metaphysicist means to suggest here that EY has ambitions he hasn’t acknowledged (e.g., the ambition to make money without conventional credentials), not that he fails to acknowledge any of the ambitions he has.
I don’t think it’s rational to treat people as inherently less rational simply because they seem resentful.
Thank you for this analysis; it made me think more about my motivations and their validity. I believe that my decision to permanently disengage from discussions with some people is based on the futility of such discussions in the past, not on the specific reasons they are futile. At some point I simply decide to cut my losses.
There’s actually good reason for the broader meaning of “ax to grind.” Any special stake is a bias.
Indeed, present company not excluded. The question is whether it permanently prevents the ax-grinder from listening. EY, too, has his share of unacknowledged irrationalities, but both his status and his ability to listen and to provide insights make engaging him in a discussion a rewarding, if sometimes frustrating, experience.
I don’t know why srdiamond’s need to bash SI is so entrenched, or whether it can be remedied to the degree where he is once again worth talking to, so at this point it is instrumentally rational for me to avoid replying to him.
Well, all we really know is that he chose to. It may be that everyone he works with then privately berated him for it.
That said, I share your sentiment.
Actually, if SI generally endorses this sort of public “airing of dirty laundry,” I encourage others involved in the organization to say so out loud.