Hello, all. I’m Joe. I’m 43, currently a graduate student in computational biology (in which I am discovering that a lot of inference techniques in biology are based on Bayes’s Theorem). I’m also a professional software developer, and have been writing software for most of my life (since about age 10). In the early 1990s I was a graduate student at the AI lab at the University of Georgia, and though I didn’t finish that degree, I learned a lot of stuff that was of great utility in my career in software development—among other things, I learned about a number of different heuristics and their failure modes.
I remember a moment early in my professional career when I was trying to convince someone that some bug wasn’t my fault, but was a bug in a third-party library. I very suddenly realized that, in fact, the problem was overwhelmingly more likely to be in my code than in the libraries and other tools we used, tools which were exercised daily by hundreds of thousands of developers. In that instant, I became much more skeptical of my own ability to do things Right. I think that moment was the start of my journey as a rationalist. I hadn’t thought about that process in a systematic way, though, until recently.
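That moment of updating can be made explicit as a small Bayesian calculation. The numbers below are purely illustrative assumptions of mine (not anything measured), but they show the shape of the reasoning: a mature library exercised daily by huge numbers of developers has had vastly more chances for a given bug to surface and be fixed, so the prior weight falls heavily on one’s own fresh code.

```python
# Illustrative sketch: "is the bug in my code or in the library?"
# All numbers are made-up assumptions for demonstration only.

def posterior_bug_in_my_code(prior_mine, p_symptom_if_mine, p_symptom_if_lib):
    """Posterior probability the bug is mine, given the observed symptom,
    by straightforward application of Bayes's Theorem."""
    prior_lib = 1.0 - prior_mine
    numerator = prior_mine * p_symptom_if_mine
    denominator = numerator + prior_lib * p_symptom_if_lib
    return numerator / denominator

# Assumed prior: 99% of residual defects live in my new code rather than
# in a battle-tested third-party library. Assume the symptom is equally
# likely under either hypothesis, so the prior carries the day.
p = posterior_bug_in_my_code(
    prior_mine=0.99,
    p_symptom_if_mine=0.5,
    p_symptom_if_lib=0.5,
)
print(round(p, 2))  # prints 0.99
```

Even if the symptom looked mildly more "library-like," the lopsided prior means the posterior still points squarely at my own code, which is exactly the realization described above.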
I’ve known of LW for quite a while, but really got interested when lukeprog of http://commonsenseatheism.com started reading Eliezer’s posts sequentially. I’m now reading the sequences somewhat chaotically; I’ve read around 30% of the sequence posts.
My fear is that, no matter how far I progress as a rationalist, I’ll still be doing it Wrong. Or I’ll still fear that I’m doing it wrong. I think I suffer greatly from under-confidence (http://lesswrong.com/lw/c3/the_sin_of_underconfidence/), and I’m very risk-averse—a property which I’ve only lately begun to view as a liability.
I am coming to view formal probabilistic reasoning as of fundamental importance to understanding reality, and I’d like to learn all I can about it.
If I overcome my reluctance to be judged by this community, I might write about my experiences with education in the US, which I believe ill-serves many of its clients. I have a 14-year-old daughter who is “unschooled”. The topics of raising children as rationalists, and rational parenting, could engender some valuable discussions.
I might write about how, as an atheist, I’ve found it practically useful to belong to a religious community (a Unitarian Universalist church). “Believing in” religion is obviously irrational, but being connected with a religious community can in some circumstances be a rational, and non-cynical, move.
I might also write about software debugging as a rational activity. Though that’s kind of obvious, I guess. OTOH, debugging is IMO a severely undervalued skill in the field of software development. Most of my work is in soft real-time systems, which require a quite different approach to debugging from interactive/GUI/web application development.
I might write about my own brief bout with mental illness, and about the process of dealing with a severely mentally-ill close relative, from a rationalist perspective.
My favorite sentence on LW so far: “Rationalists should WIN.”
If you have the time and inclination to test this, you can use this site to discover your level of under- or over-confidence, and adjust appropriately.
In any case, welcome to LessWrong! I look forward especially to hearing about the process of unschooling; there is (very rightly) an impression here on LessWrong that raising a child is one of the hardest tasks, and taking responsibility for their education as well seems even more daunting!