I agree with the first two paragraphs here. :)
Indeed, these are items on a ‘high-level reasons not to be maximally pessimistic about AGI’ list I made for some friends three years ago. Maybe I’ll post that on LW in the next week or two.
I share Eliezer’s pessimism, but I worry that some people only have negative factors bouncing around in their minds, and not positive factors, and that this is making them overshoot Eliezer’s ‘seems very dire’ and go straight to ‘seems totally hopeless’. (Either with regard to alignment research, or with regard to the whole problem. Maybe also related to the tendency IME for people to assume a problem is either easy or impossible, without much room in between.)