I disagree that it is as inaccurate as you claim. Specifically, they did actually say that “AI doomsday scenarios belong more in the realm of science fiction”. I don’t think it’s inaccurate to quote what someone actually said.
When they talk about “having more work to do”, etc., it seems that they are emphasizing the risks of sub-human intelligence and de-emphasizing the risks of superintelligence.
Of course, LW being LW, I know that balance and fairness are valued very highly, so would you kindly suggest what you think the title should be, and I will change it.
I will also add in the paragraphs you suggest.