A few related thoughts:
One story we could tell is that what these people have in common is that they take alignment seriously, not that they are generally pessimists.
I think alignment is unsolved in the general case, which makes it harder to argue strongly that it will get solved for future systems. But I don't buy that people would fail to update on seeing a solution or strong arguments for that conclusion, and I think some of Quintin's and Nora's arguments have caused people I know to rethink their positions and update somewhat in that direction.
I think the rationalist and EA spaces have been healthy enough for people to express quite extreme positions, such as expecting an AI takeover or extinction. It would be a strongly negative sign if everyone in these spaces held identical views, or if everyone gave up all hope for civilization's prospects; but in the absence of that, I think it's a sign of health that people are able to be open about having very strong views. I also think the people who most confidently anticipate an AI takeover sometimes feel and express hope.
I don't think everyone is starting with pessimism as their bottom line, and I think it's inaccurate to describe the majority of people in these ecosystems as temperamentally or epistemically pessimistic.