Yes, I do expect that if we don’t get wiped out, we may get somewhat bigger “warning shots” that humanity is more likely to pay attention to. I don’t know how much that actually moves the needle, though.
Ok, sure, but extra resources and attention are still better than none.
This isn’t obvious to me; it might make things harder. Consider how Elon Musk read Superintelligence and developed concerns about AI risk, but the result was that he founded OpenAI and gave it a billion dollars to play with. You could make an argument that doing so accelerated timelines and reduced our chances of avoiding negative outcomes.