[EDIT: fallenpegasus points out that there’s a low bar to entry to this corner of TIME’s website. I have to say I should have been confused that even now they let Eliezer write in his own idiom.]
The Eliezer of 2010 had no shot at being directly published (instead of featured in an interview that at best paints him as a curiosity) in the TIME of 2010. I’m not sure about 2020.
I wonder at what point the threshold of “admitting it’s at least okay to discuss Eliezer’s viewpoint at face value” was crossed for the editors of TIME. I fear the answer is “last month”.
Public attention is rare, and safety measures are even rarer, unless there’s real-world damage. This is a known pattern in engineering, product design, and project planning, so I fear there will be little public attention and even less legislation until someone actually gets hurt by AI. That could take the form of a hot-coffee-type incident or a Chernobyl-type incident. The threshold won’t be discussing Eliezer’s point of view (we’ve been doing that for a long time) but losing sleep over it. I appreciate Yudkowsky’s use, in the article, of the think-of-the-children stance, which has a great track record for sparking legislation.