I am going to respond to the overall direction of your responses.
That is feeble, and for those who don’t understand why, let me explain.
Eliezer works for SIAI, a non-profit where his pay depends on donations. Many people on LW are interested in SIAI; some even donate, and others potentially could. When your pay depends on convincing people that your work is worthwhile, it is always worth justifying what you are doing. This becomes even more important when it looks like you’re distracted from what you are being paid to do. (If you ever work with a VC and their money, you’ll know what I mean.)
When it comes to ensuring that SIAI continues to pay, especially when you are the FAI researcher there, justifying why you are writing a book on rationality that in no way solves FAI becomes extremely important.
EY, ask yourself this: what percent of the people who are interested in SIAI and donate are interested in FAI? Then ask what percent are interested in rationality with no clear plan for how that gets to FAI. If the answer to the first is greater than the second, then you have a big problem, because one could interpret the use of your time writing this book on rationality as wasting donated money, unless there is a clear reason why rationality books get you to FAI.
P.S. If you want to educate people to help you out, as someone speculated, you’d be better off teaching them computer science and mathematics.
Remember, my post drew no conclusions, so, Yvain, I have cast no stones; I merely ask questions.
If you want to educate people to help you out as someone speculated you’d be better off teaching them computer science and mathematics.
Even on the margin? There are already lots of standard textbooks and curricula for mathematics and computer science, whereas I’m not aware of anything else that fills the function of Less Wrong.
If you are previously a donor to SIAI, I’ll be happy to answer you elsewhere.
If not, I am not interested in what you think SIAI donors think. Given your other behavior, I’m also not interested in any statements on your part that you might donate if only circumstances were X. Experience tells me better.
“If not, I am not interested in what you think SIAI donors think.”
I never claimed to know what SIAI donors think; I asked you to think about that. But I think the fact that SIAI has as little money as it does after all these years speaks volumes about SIAI.
“Given your other behavior, ”
Why? Because I ask questions that, when answered honestly, you don’t like? Or is it because I don’t blindly hang on every word you speak?
“I’m also not interested in any statements on your part that you might donate if only circumstances were X. Experience tells me better.”
I never claimed I would donate, nor will I ever as long as I live. As for experience telling you better, you have none, and considering the lack of money SIAI has and your arrogance, you probably never will, so I will keep my own counsel on that part.
“If you are previously a donor to SIAI, I’ll be happy to answer you elsewhere.”
Why? Because you don’t want to disrupt the LW image of Eliezer the genius? Or is it because you really are distracted, as I suspect, or have given up because you cannot solve the problem of FAI, another good possibility? These questions are simple and easy to answer, and I see no real reason you can’t answer them here and now. If you find the answers embarrassing, then change; if not, then what have you got to lose?
If your next response is as feeble as the last ones have been, don’t bother posting it for my sake. You claim you want to be a rationalist; then try applying reason to your own actions and answer the questions asked honestly.
why you are writing a book on rationality which in no way solves FAI
Rationality is the art of not screwing up: seeing what is there instead of what you want to see, or are evolutionarily susceptible to seeing. When working on a task that may have (literally) earth-shattering consequences, there may not be a skill that’s more important. Getting people educated about rationality is of prime importance for FAI.
I apologize that I rippled your pond.
The questions are fine. I think it’s the repetitiveness, obvious hostility, and poor grammar and spelling that get on people’s nerves.
Not to mention barely responding to repeated presentations of the correct answer.
...and its being a top-level post instead of an Open Thread comment. People probably would’ve been a lot more forgiving if it’d been an Open Thread comment.
You’re that sure that at this point in time you have all the information you’d ever need to make that decision?