Do you have a picture of the poster that comes with a $40 pledge? Also, do you still get the poster if you pledge more?
Meetup: Rutgers New Brunswick
Probably what I’ll end up doing. Just checking first is all.
Not sure if open thread is the best place to put this, but oh well.
I’m starting at Rutgers New Brunswick in a few weeks. There aren’t any regular meetups in that area, but I figure there have to be at least a few people around there who read LessWrong. If any of you see this, I’d be really interested in getting in touch.
I suppose modafinil should be in the same boat as caffeine for the purposes of this experiment.
I cried twice reading this. That puts it just below Humanism part 3 on my list of most touching chapters.
Quirrell in Methods has pretty much stated that he’s trying to mold Harry into a dark lord. That requires Harry to be alive and is significantly more likely if he doesn’t have Hermione’s moral influence.
You will not be thrown in an asylum for discussing this with a professional
My experience disagrees. I went to see a professional for antidepressants, was emotionally stable at that moment, and was thrown in a psych ward for a week. I had to lie about my condition to be released. The whole affair failed to help in any way.
If my inhibitions regarding a certain course of action seem entirely internal, go through with it because I’m probably limiting my options for no good reason.
You would be correct. Thanks for the link.
Day = Made
How much money do you have to donate, if you don’t mind my asking?
Kinda awkward to say aloud. I think Institute for the Research of Machine Intelligence would sound better. Minor nitpick.
No
This is a question about utilitarianism, not AI, but can anyone explain (or provide a link to an explanation of) why reducing the total suffering in the world is considered so important? I thought we pretty much agreed that morality is based on moral intuitions, and it seems pretty counterintuitive to value the states of mind of people too numerous to sympathize with as highly as people here do.
It seems to me that reducing suffering as a numbers game is the kind of thing you would say is your goal because it makes you sound like a good person, rather than something your conscience actually motivates you to do. But people here are usually pretty averse to conscious signaling, so I’m not sure that works as an explanation. I’m certain this has been covered elsewhere, but I haven’t seen it.
You don’t have to be specific, but how would grossing out the gatekeeper bring you closer to escape?
Like when you say “horrible, horrible things”, what do you mean?
Driving a wedge between the gatekeeper and his or her loved ones? Threats? Exploiting any guilt or self-loathing the gatekeeper feels? Appealing to the gatekeeper’s sense of obligation by twisting his or her interpretation of authority figures, objects of admiration, and internalized sense of honor? Asserting cynicism and general apathy towards the fate of mankind?
For all but the last one it seems like you’d need an in-depth knowledge of the gatekeeper’s psyche and personal life.
But you wouldn’t actually be posting it; you would be posting the fact that you consider it possible for someone to post it, which you’ve clearly already done.
I’m not sure I followed that. Do you still get tickets as long as you pledge $25 or higher? Or if you want both the poster and a ticket, do you have to make two pledges totaling $65?