I disagree with both of these methods. If EY were 100% sure and NB were 50% sure, then I think the entire 20 should go to EY, and neither of the two methods has this property. I am very interested in figuring out what the best formula for this situation is, but I do not yet know. Here is a proposal:
Find the smallest amount of evidence such that shifting both predictions toward each other by that amount of evidence makes them equal, and split according to the resulting probability.
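One way to make this concrete (my sketch, not something stated in the original): measure "evidence" in log-odds, so that shifting both predictions by an equal amount of evidence means moving both log-odds toward each other by the same distance. They meet at the log-odds midpoint, which is where the split probability comes from. The function names here are just illustrative:

```python
import math

def logit(p):
    """Convert a probability to log-odds; infinite at the extremes."""
    if p == 0.0:
        return -math.inf
    if p == 1.0:
        return math.inf
    return math.log(p / (1.0 - p))

def sigmoid(x):
    """Convert log-odds back to a probability."""
    if x == math.inf:
        return 1.0
    if x == -math.inf:
        return 0.0
    return 1.0 / (1.0 + math.exp(-x))

def split_probability(p, q):
    """Shift both predictions by the same amount of evidence (log-odds)
    until they meet; the meeting point is the log-odds midpoint."""
    mid = (logit(p) + logit(q)) / 2.0
    if math.isnan(mid):
        # p = 1 and q = 0 (or vice versa): no finite meeting point,
        # so split evenly by convention.
        return 0.5
    return sigmoid(mid)
```

Note that this has the property asked for above: `split_probability(1.0, 0.5)` returns 1.0, so a 100%-sure EY would get the entire 20. For non-extreme beliefs it is the geometric mean of the odds, e.g. 90% vs. 50% gives odds of 9:1 and 1:1, whose geometric mean is 3:1, so a 75/25 split.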
Is this algorithm good?
Hello Less Wrong! I am Scott Garrabrant, a 23-year-old math PhD student at UCLA, studying combinatorics. I discovered Less Wrong about 4 months ago. After reading MoR and a few sequences, I decided to go back and read every blog post. (I just finished all of Eliezer’s OB posts.) I was going to wait to start posting until I was completely caught up, but then I started attending weekly meetups 2 months ago, and now I need to earn enough karma to make meetup announcements.
I have been interested in meta-thinking for a long time. I have spent a lot of time thinking about the nature of rationality, purely out of curiosity, and have independently reached many of the same conclusions I have found on this blog. I realized in high school, about 6 years ago, that decision/probability theory was the correct language in which to talk about rationality. It has made me very happy to learn that there are so many like-minded people.
However, there is one mistake I have been making for a long time: I have been giving other people too much credit for their rationality. I have been treating other people as almost-rational agents with different utility functions and very different prior probabilities. This blog has taught me how wrong that view was, which is causing me to rethink some of my prior views.
One thing I would like some help deciding right now is about Unitarian Universalism. I would love it if any rationalists who know anything about Unitarianism (or who don’t) could help me out. I am agnostic (atheist, unless you define the god hypothesis to include the simulation hypothesis). I believe that most of the bad parts of religion and theism come from the fact that they tend to encourage irrationality. So far, my picture of the average Unitarian is someone of above-average rationality, but not great. The main thing that attracts me to the group is that they (at least claim to) promote “a free and responsible search for truth and meaning.” Their search algorithms could really use some work, but they both view truth as a goal and understand that they have not attained it completely. In looking for a local community to provide “brownies and babysitters,” it seems to be the best I have found. Also, although I do not have a “god-shaped hole” that needs to be filled, I understand that many people do, and so I can see that it might be good to support an organization that helps those people fill that hole with something that does not encourage irrationality.

On the other hand, sometimes I feel like Unitarians care a lot more about the “free” part of “free and responsible search for truth and meaning” than the “responsible” part. I am worried that they like to discuss their individual beliefs as they would discuss their favorite colors, and never actually change them. Maybe, with our current messed-up society, the first step is for people to feel free to believe what they want, and then to learn how to be critical.
In attending Unitarian churches, I have repeatedly enjoyed myself, thought about interesting philosophy (even when I disagreed with the sermon), had sufficiently strong emotional responses to the music (e.g. “Imagine”), and been encouraged by how much people were willing to help each other. I already know that I enjoy the experience. What I am trying to decide is whether, morally, I should be willing to support this organization. Looking further ahead, I am also trying to decide whether I should be worried that being around this kind of thinking might be bad for my future kids.