I guess I’d recommend the AGI safety fundamentals course: https://www.eacambridge.org/technical-alignment-curriculum
On Stuart’s list: I think it might be suitable for some types of conceptual alignment research, but you’d certainly want to read more ML for other types of alignment research.