What, you mean in mainstream philosophy? I don’t think mainstream philosophers think that way, even Quineans. The best ones would say gravely, “Yes, goals are important” and then have a big debate with the rest of the field about whether goals are important or not. Luke is welcome to prove me wrong about that.
I actually don’t think that’s right. Last time I asked a philosopher about this, they pointed me to an article (by I.J. Good, I think) on how to choose the most valuable experiment, given your goals, using decision theory.
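For concreteness, here is a minimal sketch of the kind of calculation that article is about: the expected value of running an experiment before acting, computed with plain decision theory. The hypotheses, test probabilities, and payoffs below are invented purely for illustration.

```python
# Sketch: value of information for one experiment (illustrative numbers only).

def expected_value_of_experiment(prior, likelihoods, utilities):
    """Expected utility gained by running the experiment before acting.

    prior:       {hypothesis: probability}
    likelihoods: {hypothesis: {outcome: P(outcome | hypothesis)}}
    utilities:   {action: {hypothesis: utility of that action if true}}
    """
    def best_expected_utility(belief):
        # Utility of the best action under the given belief state.
        return max(
            sum(belief[h] * utilities[a][h] for h in belief)
            for a in utilities
        )

    # Expected utility of acting now, without the experiment.
    eu_now = best_expected_utility(prior)

    # Expected utility of updating on the outcome, then acting.
    outcomes = {o for dist in likelihoods.values() for o in dist}
    eu_after = 0.0
    for o in outcomes:
        p_o = sum(prior[h] * likelihoods[h].get(o, 0.0) for h in prior)
        if p_o == 0.0:
            continue
        posterior = {h: prior[h] * likelihoods[h].get(o, 0.0) / p_o
                     for h in prior}
        eu_after += p_o * best_expected_utility(posterior)

    return eu_after - eu_now  # value of information; never negative

# Toy example: two hypotheses, one noisy test, two possible actions.
prior = {"H1": 0.5, "H2": 0.5}
likelihoods = {"H1": {"pos": 0.9, "neg": 0.1},
               "H2": {"pos": 0.2, "neg": 0.8}}
utilities = {"act_for_H1": {"H1": 10, "H2": 0},
             "act_for_H2": {"H1": 0, "H2": 10}}

voi = expected_value_of_experiment(prior, likelihoods, utilities)
```

Given several candidate experiments, you would run this for each and pick the one whose value of information (minus its cost) is highest.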
AI research is where to look for your question, SarahC. Start with chapter 2 of AI: A Modern Approach, along with the chapters with ‘decisions’ in their titles.
That was my intuition. Just wanted to know if there’s more out there.
Yes, that’s about right.
Thank you!