Here’s something I’ve linked to a number of times on LessWrong, but as a newcomer (I think, despite the lack of a green shoot by your name) you will likely not have seen it before. I wrote it, inspired by a lot of stuff I’ve seen. It approaches the topic of the post in a very different manner. I’d be interested in what answer you would make to Insanity Wolf. Beware: here be dragons.
Along these lines, Scott can be quoted:

> I can ask it “tell me the truth, is this eventually going to result in my eyes being pecked out by seagulls?” and if it answers “yes, I have a series of twenty-eight switches, and each one is obviously better than the one before, and the twenty-eighth is this world except your eyes are getting pecked out by seagulls”, then I will just avoid the first switch. I realize that will intuitively feel like leaving some utility on the table—the first step in the chain just looks so much obviously better than the starting point—but I’m willing to make that sacrifice.
Scott doesn’t apply this to EA, but if you start from “you have to save a child” and the end point is “you have to sacrifice a lot to save as many children as you can”, this seems relevant.
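As a toy sketch (mine, not Scott’s; the `local_appeal` score and the seagull flag are made-up stand-ins), the rule in the quote is easy to state operationally: a chooser who only ever compares adjacent states walks the whole chain, while this rule vetoes step one on the strength of the oracle’s answer about the endpoint.

```python
# Toy illustration: a chain of "switches" where every step looks locally
# better, but the endpoint is worse than the start. The agent asks an
# oracle about the final state before taking the first switch.

def looks_better(a, b):
    """Each adjacent switch is 'obviously better' than the one before it
    by some local metric; a hypothetical stand-in for intuitive appeal."""
    return b["local_appeal"] > a["local_appeal"]

def endpoint_is_bad(chain):
    """The oracle's answer to 'where does this eventually lead?'"""
    return chain[-1]["eyes_pecked_out_by_seagulls"]

# The starting world plus twenty-eight switches: appeal rises at every
# step, but the last state carries the disastrous global property.
chain = [{"local_appeal": i, "eyes_pecked_out_by_seagulls": False}
         for i in range(29)]
chain[-1]["eyes_pecked_out_by_seagulls"] = True

# Every single step passes the local comparison...
assert all(looks_better(chain[i], chain[i + 1]) for i in range(28))

# ...so a purely local chooser walks the whole chain. The quoted rule
# instead refuses the *first* switch once the oracle says the endpoint
# is bad, deliberately leaving some local utility on the table.
take_first_switch = not endpoint_is_bad(chain)
print("Take the first switch?", take_first_switch)  # -> False
```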
I was scrolling for a while, assuming I’d neared the end, only to look at the position of the scrollbar and find I was barely 5% through! This must have taken a fair bit of effort. I really like the page and I’m glad I know about it; I encourage you to make a linkpost for it sometime if you haven’t already.
A labour of love. Or something. :)

I have another thousand or so of these, which I may just dump on a second page, unsorted, called The Gospel According to Insanity Wolf. That’s not counting the ones that I’ve decided are too extreme to publish at all. All drawn from life.
> I was scrolling for a while, assuming I’d neared the end, only to look at the position of the scrollbar and find I was barely 5% through!
And there’s a meme for that too! The last in the Altruism section.
IS THIS NEARLY OVER YET?
LOOK AT THE SCROLL BAR!
YOU’VE HARDLY STARTED!
I’m sure there’s an argument to be made in defence of supererogation. I’ve never seen it, though. People say “but demandingness” and Chad Singer replies “Yes.” My own faith in the boundedness of duty, in both magnitude and distance, is sufficient for me not to take even one step onto the slippery path that leads only down, to the altruism event horizon that rips souls apart.