Daniel, your interpretation is contradicted by Eliezer’s own words: he defines dignity as that which increases our chance of survival.
“‘Wait, dignity points?’ you ask. ‘What are those? In what units are they measured, exactly?’

And to this I reply: Obviously, the measuring units of dignity are over humanity’s log odds of survival—the graph on which the logistic success curve is a straight line. A project that doubles humanity’s chance of survival from 0% to 0% is helping humanity die with one additional information-theoretic bit of dignity.”
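To spell out the log-odds arithmetic that quote is pointing at (a sketch, writing $p$ for humanity’s survival probability): dignity in bits is the base-2 log odds, and for $p$ close to zero, doubling $p$ adds about one bit.

$$
\ell(p) = \log_2\frac{p}{1-p}, \qquad \ell(2p) - \ell(p) = \log_2\frac{2(1-p)}{1-2p} \;\approx\; 1 \text{ bit when } p \ll 1.
$$

So “one more bit of dignity” just means “we doubled our odds,” even if the rounded percentage still reads 0%.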
I don’t think our chances of survival will increase if LessWrong becomes substantially more risk-averse about publishing research and musings about AI. I think they will decrease.