By my quick mental count, CHAI’s Berkeley branch had the equivalent of roughly 8 to 11 researchers focusing on AI alignment in 2018. It’s tricky to count exactly because we had new PhD students arriving in August, as well as some interns over the summer (some of whom stayed on for longer periods).
Hmm. I notice that in the case of AI safety, it’s probably possible to just literally count the researchers by hand. I assume for “broader work on AI” it’d be necessary to consult some kind of survey that already had them counted, since there’s just way too much stuff going on.
I notice that in the case of AI safety, it’s probably possible to just literally count the researchers by hand.
I think this is probably not true for the average LW reader, or even the average person who’s somewhat interested in AI alignment, since many orgs are fairly opaque about how many people work there and which team people are on. For example, my guess is that most people don’t know how many interns CHAI takes, or how many new PhD students we get in a given year. Similarly, I’m not confident that I could name everybody on OpenAI’s safety team without someone to catch my errors.
I assume for “broader work on AI” it’d be necessary to consult some kind of survey that already had them counted, since there’s just way too much stuff going on.
Nod. I didn’t mean you could count them trivially, but I hadn’t even been thinking of “someone from each org just mentions the approximate number of researchers and then you add them up” as a possible solution.
Seems correct to me.
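The “each org reports an approximate range, then you add them” approach above amounts to a simple interval sum. A minimal sketch: the CHAI range is taken from the thread; the other orgs and their numbers are purely hypothetical placeholders, not real estimates.

```python
# Interval-sum sketch of the "ask each org, then add" estimate.
# Only the CHAI range comes from the thread; every other row is a
# made-up placeholder to illustrate the arithmetic.
org_estimates = {
    "CHAI (Berkeley)": (8, 11),      # from the comment above
    "Org B (placeholder)": (5, 10),  # hypothetical
    "Org C (placeholder)": (3, 6),   # hypothetical
}

# Sum the lower and upper bounds separately to get a total range.
low = sum(lo for lo, hi in org_estimates.values())
high = sum(hi for lo, hi in org_estimates.values())
print(f"Estimated total: {low} to {high} researchers")
```

The point of keeping ranges rather than point estimates is that the uncertainty in each org’s headcount (interns, incoming PhD students) carries through to the total instead of being hidden.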