Given recent discussion of short timelines and takeoff on LessWrong, the Alignment Forum, and elsewhere, I’ve been quite worried. I want to be as skeptical as I can, but it’s hard to judge anything: I don’t know what information I’m missing from the timeline estimates I hear, and increasingly strange or concerning things are made public regularly. I can’t really say how likely short timelines are, but given what’s happening, it seems absurd to dismiss it all as collective delusion; something serious appears to be happening.
Because of this information deficit, I don’t know what work would lead to positive outcomes under these conditions, which of those problems I could contribute the most to, or how I could start on that work.
So I feel my current task is to develop a theory of positive impact in this situation and work outward from there. I think it would be extremely useful to talk 1:1 with as many people who think about this as I can, to understand what they consider the relevant factors.