Unknown wrote:
As I’ve stated before, we are all morally obliged to prevent Eliezer from programming an AI.
As Bayesians, educated by Mr. Yudkowsky himself, I think we all know the probability of such an event is quite low. In 2004, in the most moving and intelligent eulogy I have ever read, Mr. Y stated: “When Michael Wilson heard the news, he said: ‘We shall have to work faster.’ Any similar condolences are welcome. Other condolences are not.” Somewhere, some person or group is working faster, but at the Singularity Institute, all the time is being spent on somewhat brilliant and very entertaining writing. I shall continue to read and reflect, for my own enjoyment. But I hope those others I mentioned have Mr. Y’s native abilities, because I agree with Woody Allen: “I don’t want to achieve immortality through my work. I want to achieve it by not dying.”