If you believe my moral system (not the topic of this post) is patently absurd, please PM me the full version of your argument. I promise to review it with an open mind. Note: I am naturally afraid of torture outcomes, but that doesn’t mean I’m not excited about FAI. That would be patently absurd.
Torture mostly comes up because philosophical thought-experiments...
To clarify: are you saying there is no chance of torture?
Yes, I am saying that the scenario you allude to is vanishingly unlikely.
But there’s another point, which cuts close to the core of my values, and I suspect it cuts close to the core of your values, too. Rather than explain it myself, I’m going to suggest reading Scott Alexander’s Who By Very Slow Decay, which is about aging.
That’s the status quo. That’s one of the main reasons I, personally, care about AI: because if it’s done right, then the thing Scott describes won’t be a part of the world anymore.
Good piece, thank you for sharing it.
I agree with you and Scott Alexander—painful death from aging is awful.