I am interpreting IlyaShpitser as commenting on OphilaDros’s presentation; why say Ng “disses” UFAI concerns instead of “dismisses” them?
It also doesn’t help that the underlying content is a handful of different issues that all bleed together: the orthogonality question, the imminence question, and the Hollywood question. Ng is against Hollywood and against imminence, and I haven’t read enough of his writing on the subject to be sure of his thoughts on orthogonality, which is one of the actual meaningful points of contention between MIRI and other experts on the issue. (And even those three don’t touch on Ng’s short objection, that he doesn’t see a fruitful open problem!)
My impression was that imminence is a point of contention much more than orthogonality is. Who specifically do you have in mind?
This article is a good place to start in clarifying the MIRI position. Since their estimate for imminence seems to boil down to “we asked the community what they thought and made a distribution,” I don’t see that as a point of contention.
There is broad uncertainty about timelines, but the MIRI position is “uncertainty means we should not be confident we have all the time we need,” not “we’re confident it will happen soon,” which is where someone would need to be for me to say they’re “for imminence.”
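To make the “asked the community and made a distribution” step concrete, here is a minimal sketch in Python with invented numbers (nothing below comes from any actual survey): pool individual timeline answers into an empirical distribution and read off the median and the early tail, which is the part the “we may not have all the time we need” argument rests on.

```python
import statistics

# Hypothetical survey answers (made-up numbers, purely illustrative):
# each entry is one respondent's estimated year for human-level AI.
survey_years = [2035, 2040, 2045, 2050, 2055, 2060, 2070, 2080, 2100, 2150]

# Pool the answers into an empirical distribution and read off summary points.
median_year = statistics.median(survey_years)
deciles = statistics.quantiles(survey_years, n=10)  # nine cut points (Python 3.8+)
early_tail = deciles[0]  # roughly the 10th percentile

print(f"median estimate:  {median_year}")
print(f"10th percentile:  {early_tail}")
# The argument above turns on the left tail, not the median: even when the
# median is decades out, nontrivial probability mass can sit on early dates.
```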
Interesting. I considered imminence more of a point of contention because the most outspoken “AI risk is overhyped” people mostly use it as an argument (and I consider this bunch way more serious than Searle and Brooks: Yann LeCun, Yoshua Bengio, Andrew Ng).
Ok, sure. Changed the title in line with Vaniver’s suggestion.
I had not understood what the “tribal talk” comment was referring to either and then decided to put only as much effort into understanding it as the commenter had in being understood. :)
Which tribe is Ng in? (if that’s what you are talking about)
Yes, I didn’t mean Ng. “Diss” is sort of unfortunate phrasing; he just wants to get work done. Sorry for being unclear.