Yes, but there are also many examples of people arriving at the same idea or conclusion at the same time. Take, for example, A. N. Kolmogorov and Gregory Chaitin, who independently proposed the same definition of randomness.
The circumstances regarding Eliezer Yudkowsky are, however, different. Other people came up with the ideas he uses as supporting evidence and as his public pronouncements. Some of those people even drew similar inferences, yet they do not ask for donations to stop an otherwise inevitable apocalypse.
Your argument does not seem to work. I pointed out that there is stupidity among professionals, but I made no claim that there is only stupidity. So your examples do not disprove the point.
It is nice when people independently come up with similar things, especially if they happen to be correct, but it is by no means to be expected in every case.
Would you be interested in taking specific pieces apart and/or arguing them?
The argument was that Eliezer Yudkowsky, to my knowledge, has not come up with anything unique. The ideas on which the SIAI is based, and for which it asks for donations, are not new. Given the basic idea of superhuman AI and the widespread awareness of it, I thought it was not unreasonable to inquire about the state of activism aimed at preventing it.
Are you trying to disprove an argument I made? I asked for an explanation and wasn’t stating some insight about why the SIAI is wrong.
Is Robin Hanson donating most of his income to the SIAI?