Did I summarize your point correctly:
1. An instinctive reaction of the type "you are so full of it" to any poorly supported, extravagant claim has a fancy name, "Bayesian adjustment", and so can be trusted.
2. Singularity research is one such claim.
3. GiveWell's charity metrics penalize charities with high uncertainty.
4. Give to GiveWell if you want to be sure your donation is not wasted.
Edit: I'm not saying that I agree with this, just checking that my understanding isn't off-base.
Actually, I had a negative reaction to this comment for the opposite reason: it seemed overly critical of the post. The first point seemed to ignore a fair amount of his argument and instead focus on criticizing what he named his method; the last point seemed to me to impugn Holden's motives based on something he never actually said.
Thanks!
Explaining the math behind our instincts is usually a worthy goal. You call it "Bayesian" because it is, of course, Bayesian.
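For concreteness, here is a minimal sketch of the kind of adjustment being discussed, assuming the standard normal-normal conjugate update (the function name and all numbers are hypothetical, chosen only for illustration): an extravagant estimate with enormous uncertainty gets shrunk almost all the way back toward the skeptical prior.

```python
def bayesian_adjustment(prior_mean, prior_var, estimate, estimate_var):
    """Combine a normal prior with a normal, noisy estimate (conjugate update).

    The larger estimate_var is, the less the estimate moves the posterior,
    i.e. a poorly supported claim gets heavily discounted.
    """
    precision = 1.0 / prior_var + 1.0 / estimate_var
    post_var = 1.0 / precision
    post_mean = post_var * (prior_mean / prior_var + estimate / estimate_var)
    return post_mean, post_var

# Hypothetical numbers: a skeptical prior of ~1 unit of good per dollar,
# versus a claimed 1000 units with enormous uncertainty.
print(bayesian_adjustment(prior_mean=1.0, prior_var=4.0,
                          estimate=1000.0, estimate_var=1_000_000.0))
# The posterior mean lands near the prior (~1), not near the extravagant claim.
```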
Actually, they advocate that you should give to charities that both score highly on their metrics and pursue some goal that you yourself find worthy.
See also GiveWell’s Do-It-Yourself Charity Evaluation Questions.