The main funders are LTFF, SFF/Lightspeed/other S-process grants from Jaan Tallinn, and Open Phil. LTFF is the main one that solicits grant applications from independent researchers.
There are a lot of orgs. Off the top of my head, there are Anthropic/OpenAI/GDM as the scaling labs with decent-sized alignment teams, and then a bunch of smaller/independent orgs:
Alignment Research Center
Apollo Research
CAIS
CLR
Conjecture
FAR
Orthogonal
Redwood Research
And there’s always academia.
(I’m sure I’m missing a few though!)
(EDIT: added in RR and CLR)
Redwood Research?
I don’t think they’re hiring, but added.
Center on Long-term Risk (CLR)
In France, EffiSciences is looking for new members and interns.