I guess part of the issue is that in any discussion, people don't use the same terms in the same way. Some people describe present-day AI capabilities as "superintelligent" within a specific domain, which is not how I understand the term, though I can see where the idea to call it that comes from. But of course such mismatched definitions make discussions really hard. Seeing stuff like that makes it very understandable why Yudkowsky wrote the LW Sequences...
Anyway, here is an example of a recent shortform post that grapples with the same issue: vague terms make discussion confusing.