Fenty, I didn’t mean to suggest that people with massive resources are more unfriendly than others, but rather that people with power have little reason to respect those without it. Humans have a poor track record with coercive paternalism regardless of stated motives (I believe both Bryan and Eliezer have posted about that quite a bit in the past). I just don’t think the people with the capability to get the first AGI online would possess the impeccable level of friendliness needed, or anywhere near it.
If Eliezer is right about the potential of AGI, then building the first one for the good of humanity might be irrational, because it might spark an AI arms race (which would almost certainly lower the quality of friendliness of the resulting AIs).