Even if your characterization of AI were accurate, an unaligned AI could find it valuable to dedicate part of its resources to helping humans. In your analogy, isn't that the role of myrmecologists today? If the ratio ants:us = us:AI held, solving our problems wouldn't require a significant expenditure of resources.