The idea of getting FAI contributors who are unlikely to ever switch jobs seems like it might be the most stringent hiring requirement. It might be worthwhile to look at people who gain government clearances and then move to non-government jobs, to see whether they abuse the top-secret information they had access to.
There are two issues here; you’ve only described one. The first is someone moving to a team that builds a non-Friendly AGI. The second is simply someone moving away: no matter what they go on to do, SI has lost the benefit of their contribution. Someone who is really “deeply committed to AI risk reduction” would not leave the FAI effort for “mere money” offered by Google. Or so the OP suggests.