Yes, I agree. I think it is important to remember that achieving AGI and doom are two separate events. Many people around here draw a strong connection between them, but not everyone. I'm in the camp that we are 2 or 3 years away from AGI (it's hard to see why GPT-4 does not already qualify), but I don't think that implies the imminent extinction of human beings. It is much easier to convince people of the first point, because the evidence is already out there.