I personally think "pessimistic vs. optimistic" misframes it, because it recasts a question about the world as a matter of personal predisposition.
I would like to see reasoning.
Your reasoning in the comment thread you linked to is:
“history is full of cases where people dramatically underestimated the growth of scientific knowledge, and its ability to solve big problems”
That’s a broad reference-class analogy to lean on. I think it carries little to no weight as to whether there would be sufficient progress on the specific problem of keeping “AGI” safe over the long term.
I wrote up why that specific problem would not be solvable.