Benquo isn’t saying that these attitudes necessarily follow, but that in practice he’s seen them happen. There is a lot of unspoken LessWrong / SIAI history here. Eliezer Yudkowsky and many others “at the top” of SIAI felt personally responsible for the fate of the human race. EY believed he needed to develop an AI to save humanity, but for many years he would discuss his thoughts on AI with only one other person, not trusting even the other people in SIAI, and requiring them to leave the area when the two of them talked about AI. (For all I know, he still does that.) And his plans basically involve creating an AI to become world dictator and stop anybody else from making an AI. All of that is reducing the agency of others “for their own good.”
This secrecy was endemic at SIAI; on walks around NYC with their senior members, I’ve sometimes seen 2 or 3 of them gather together and whisper, asking anyone who got too close to please walk further away, because the ideas they were discussing were “too dangerous” to share with the rest of the group.
Well, that’s… unfortunate. I apparently don’t hang around in the same circles, because I have not seen this kind of behaviour among the Effective Altruists I know.