Certainly, I understand this science vs. engineering, pure vs. applied, fundamental vs. emergent, theoretical vs. computational vs. observational/experimental classification is fuzzy: relevant xkcd, smbc. Hell, even the math vs. physics vs. chemistry vs. biology distinctions are fuzzy!
What I am saying is that either your definition has to be so narrow as to exclude most of what is generally considered “science” (à la Rutherford, who ironically won his Nobel in Chemistry), or you need to exclude AI via special pleading. Specifically, my claim is that AI research is closer to physics (the simulations/computation end) than chemistry is. Admittedly, this claim is based on vibes, but if pressed, I could probably point to how many people transition from one field to the other.
Hmm, in that case maybe I misunderstood the post. My impression wasn't that he was saying AI literally isn't a science anymore, but more that the engineering work is getting too far ahead of the science part—and that in practice most ML progress now is just ML engineering, where understanding is only a means to an end (and so is not as deep as it would be if it were science first).
I would guess that engineering gets ahead of science pretty often, but maybe in ML it's more pronounced—hype and money pouring in, as well as perhaps the perceived relatively low stakes (unlike aerospace, or medical robotics, which is my field) not scaring ML engineers enough to actually care about deep understanding, and also perhaps the inscrutable nature of ML—if it were easy to understand, it wouldn't be as unappealing to spend resources doing so.
I don't really have a take on where the inelegance comes into play here.