As Ben said, this seems incongruent with the responses from the other two people, neither of whom talked much about timelines, but both of whom seemed to respond directly to the concern about catastrophic/apocalyptic risk from AGI.
I do agree it’s plausible that Matheny somehow understood the question differently from the other two people and interpreted it in a more timelines-focused way, though he also heard the other two speak, which makes that somewhat less likely. I also agree that the question wasn’t asked in the most cogent way.
Thanks for checking this! I now mostly agree with your original comment (except the first part suggesting it was point-blank, though at this point we’re quibbling over definitions); this does seem like a case of intentionally not discussing risk.