Oh come, don’t be hyperbolic. The main things that make a cult a cult are absent. And I’m under the impression that plenty of places have a standard path for inexperienced people that involves an internship or whatever. And since AI alignment is an infant field, no one has the relevant experience on their resumes. (The OP mentions professional recruiters, but I would guess that the skill of recruiting high-quality programmers doesn’t translate to recruiting high-quality alignment researchers.)
I do agree that, as an outsider, it seems like it should be much more possible to turn money into productive-researcher-hours, even if that requires recruiting people of Tao’s caliber, and the fact that that’s not happening is confusing & worrying to me. (Though I do feel bad for the MIRI people in this conversation; it’s not entirely fair, since if somehow they in fact have good reason to believe that the set of people who can productively contribute is much tinier than we’d hope (eg: Tao said no, and literally everyone else isn’t good enough), they might have to avoid explicitly explaining that to avoid rudeness and bad PR.)
I’m going to keep giving MIRI my money because it seems like everyone else is on a more-doomed path, but as a donor I would prefer to see more visible experimentation (since they said their research agendas didn’t pan out and they don’t see a path to survival). Eg I’m happy with the Visible Thoughts project. (My current guess (hope?) is that they are experimenting with some things that they can’t talk about, which I’m 100% fine with; still seems like some worthwhile experimentation could be public.)
Imagine you’re a 32-year-old software engineer with a decade of quality work experience and a Bachelor’s in CS. You apply for a job at Microsoft, and they tell you that since their tech stack is very unusual, you have to do a six-month unpaid internship as part of the application process, and there is no guarantee that you get the job afterwards.
This is not how things work. You hire smart people, then you train them. It can take months before new employees are generating value, and they can always leave before their training is complete. This risk is absorbed by the employer.
If you have an unusual tech stack, then the question of how to train people in that tech stack is fairly trivial. In a pre-paradigmatic field, the question of how to train people to effectively work in the field is nontrivial.
Wait, is the workshop 6 months? I assumed it was more like a week or two.
This is not how things work. You hire smart people, then you train them
Sometimes that is how things work. Sometimes you do train them first while not paying them, then you hire them. And for most 32-year-old software engineers, they have to go through a 4-8 year credentialing process that you have to pay a year’s worth of salary to go to. I don’t see that as a good thing, and indeed the most successful places are famous for not doing that, but still.
To reiterate, I of course definitely agree that they should try using money more. But this is all roughly in the same universe of annoying hoop-jumping as typical jobs, and not roughly in the same universe as the Branch Davidians, and I object to that ridiculous hyperbole.
they have to go through a 4-8 year training process that you have to pay a year’s worth of salary to go to
They go through a 4-8 year credentialing process that is a costly and hard-to-Goodhart signal of intelligence, conscientiousness, and obedience. The actual learning is incidental.
Okay, edited. If anything, that strengthens my point.
See this comment.
… And? What point do you think I’m arguing?
The traditional way has its costs and benefits (one insanely wasteful and expensive path that opens up lots of opportunities), as does the MIRI way (a somewhat time-consuming path that opens up a single opportunity). It seems like there’s room for improvement in both, but both are obviously much closer to each other than either one is to Scientology, and that was the absurd comparison I was arguing against in my original comment. And that comparison doesn’t get any less absurd just because getting a computer science degree is a qualification for a lot of things.
No, it doesn’t.
Sure it does. I was saying that the traditional pathway is pretty ridiculous and onerous. (And I was saying that to argue that MIRI’s onerous application requirements are more like the traditional pathway and less like Scientology; I am objecting to the hyperbole in calling it the latter.) The response was that the traditional pathway is even more ridiculous and wasteful than I was giving it credit for. So yeah, I’d say that slightly strengthens my argument.
Based on what’s been said in this thread, donating more money to MIRI has precisely zero impact on whether they achieve their goals, so why continue to donate to them?
FWIW I don’t donate much to MIRI myself anymore, precisely because they aren’t funding-constrained. And a MIRI employee even advised me as much.
Based on what’s been said in this thread, donating more money to MIRI has precisely zero impact on whether they achieve their goals
Well obviously, I disagree with this! As I said in my comment, I’m eg tentatively happy about the Visible Thoughts project. I’m hoping to see more experimentation in the future, eventually narrowing down to an actual plan.
Worst case scenario, giving them more money now would at least make them more able to “take advantage of a miracle” in the future (though obviously I’m really really hoping for more than that).
That seems a bit like a Pascal’s mugging to me, especially considering there are plenty of other organizations to give to that don’t rely on a potential future miracle, one which may or may not require the organization to already be sitting on a tremendous sum of money...
Are you taking the view that someone seeing the ad is not going to think MIRI is a cult unless it ticks all the boxes? Because they are actually going to be put off if it ticks any of the boxes.
I’m taking the view that most people would think it’s an onerous requirement and they’re not willing to jump through those hoops, not that it’s a cult. It just doesn’t tick the boxes of a cult, unless we’re defining that so widely as to include, I dunno, the typical “be a good fit for the workplace culture!” requirement that lots of jobs annoyingly have.
It’s obviously much closer to “pay several hundred thousand dollars to be trained at an institution for 4-6 years (an institution that only considers you worthy if your essay about your personality, life goals, values, and how they combat racism is a good match to their mission), and then either have several years of experience or do an unpaid internship with us to have a good chance” than it is to the Peoples Temple. To say otherwise is, as I said, obviously ridiculous hyperbole.
they might have to avoid explicitly explaining that to avoid rudeness and bad PR
Well, I don’t think that is the thing to worry about. Eliezer having high standards would be no news to me, but if I learn about MIRI being dishonest for PR reasons a second time, I am probably going to lose all the trust I have left.
I don’t think “no comment”, or rather making undetailed but entirely true comments, is dishonest.
I agree.