As a subset of this question, do you think that establishing a school with the express purpose of training future rationalists/AGI programmers from an early age is a good idea? Don’t you think that people who’ve been raised with strong epistemic hygiene should be building AGI rather than people who didn’t acquire such hygiene until later in life?
The only reasons I can see for it not working would be:
1. Predictions that AGI will arrive before the next generation of rationalists comes of age (which is also a question of how early to start such an education program).
2. Belief that our current researchers are up to the challenge. (Even then, having lots of people who've had a structured education designed to produce the best FAI researchers would undeniably reduce existential risk, no?)
EDIT (for clarification): Eliezer has said:
“I think that saving the human species eventually comes down to, metaphorically speaking, nine people and a brain in a box in a basement”
Just as they would be building an intelligence greater than themselves, so too must we build human intelligences greater than ourselves.
I can’t speak for the SIAI, but to me this sounds like a suboptimal use of resources, and bad PR. It trips my “this would sound cultish to the average person” buzzer. Starting a school that claimed it “emphasized critical thinking” to teach rationalists might be a good idea for someone with administrative talents who wanted to work on x-risk, but I can’t see SIAI doing it.
How would you distribute resources instead? I think a school like this is a natural response if one accepts the premise that the main bottleneck to AGI is a few key insights from geniuses (as Eliezer says).
Why do we care if people who aren’t logical enough to see the reasoning behind the school think we’re cultish?