I think this is a question of what satisfies your curiosity. Neither of the examples you give is a paradigm example of curiosity as such—curiosity is generally taken to mean a desire for knowledge for its own sake whereas both your examples involve seeking knowledge for practical reasons—but perhaps in your case these work and personal issues are enough to satisfy your curiosity. The fact that you’re here makes me think otherwise though. Surely you read LessWrong out of curiosity?
Yes, I do seek knowledge for other reasons, here and elsewhere. But my expectation is that this will not “look like” curiosity, because I expect few changes in my behavior based on what I read, and so the importance of it being “true” is likewise diminished. Sure, I would like my beliefs about the brain and AI to be true, but I’m not prepared to spend a LOT of resources to get there—I’m sure that if I were really curious about the role of oxytocin in relationships, I could reach true beliefs faster by spending more resources. I agree there are gradations between “French paintings” and “database performance” in how curious I am about things, and most of Less Wrong falls somewhere in the middle. The curiosity Luke was alluding to is the all-consuming curiosity about “things where I expect belief accuracy to have a large impact on my utility,” and I doubt most of Less Wrong falls into that category.