So my particular position is that I’m not giving to SIAI until I’m worth enough financially that I can ask a few hours of Eliezer’s time, and get a better idea of whether the theories are correct.
I don’t think this matches up with your rejection. Even if you were an expert in the fields Eliezer is working in, it sounds like that wouldn’t give you the ability to give any of his ideas a positive seal of approval, since many people have worked on ideas for a long time without seeing what was wrong with them. A few hours to hash out disagreements also seems like a very low estimate. How long do you think Eliezer and Robin Hanson have spent debating their theories without coming any closer to resolution?
The scenario you paint, in which you get rich enough for Eliezer to wager a few hours of his time on reassuring you, does not sound like one designed to determine the correctness of the theories so much as to give you as much emotional satisfaction as possible.
I should make clear that I do not mean to condemn, but rather to provoke introspection; it is not clear to me that there is a reason to support SIAI or other charities beyond emotional satisfaction, so it may be wise to pursue opportunities like this without being explicit that this is the compensation you expect from charities.
Clearly a few hours wouldn’t be enough for me to reach a level of knowledge comparable to the experts’, but it could definitely move my probability estimate a lot.