I migrated in the opposite direction—I double-majored in EE/CS as an undergraduate, then flirted with solid-state physics before settling on a mix of theoretical and applied math. Now I’m on leave working on a startup, my third.
Although I agree with most of what’s been said here so far, the discussion seems lopsided, so I’m assuming the role of devil’s advocate. For simplicity, I’m going to gloss over an idea I believe strongly, [1] so that I can stick to the terms already in play (e.g. “distinctions between” math and CS).
Points:
CS is a young field, so most of the theory that exists is still shallow. There’s nothing like, say, Lebesgue’s theory of measure and integration—an old, hard, important problem, done wrong for a long time, that was ultimately exhaustively solved in a way that created lots of practical applications. Obviously, taking integrals in spaces where you can’t do addition isn’t interesting for everyone. But most people who make it this far in math discover some of the shortcomings of the naive (Riemann) theory for themselves, and then can follow the solutions to those problems as they emerged historically. You can learn a lot about problem solving in general by doing this. Most programmers don’t.
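To make the Lebesgue point concrete (my own toy example, nothing deep): take the indicator function of the rationals on [0,1],

  f(x) = \begin{cases} 1 & x \in \mathbb{Q} \\ 0 & \text{otherwise} \end{cases}

Every upper Riemann sum is 1 and every lower Riemann sum is 0, so the Riemann integral simply doesn’t exist. Under Lebesgue’s theory the rationals are a set of measure zero, so

  \int_{[0,1]} f \, d\mu = 1 \cdot \mu(\mathbb{Q} \cap [0,1]) + 0 \cdot \mu([0,1] \setminus \mathbb{Q}) = 0.

Working out why the naive definition chokes on something this simple is exactly the kind of exercise I mean.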
Related to the above, CS rewards thinking hard about specific cases (debugging) more than it rewards thinking hard about theory-building or problem-decomposition.
CS instruction is under pressure from powerful interests. The curriculum is heavily influenced by big companies that want to hire entry-level grads who have already been exposed to their way of doing things. They also attempt to direct research towards incremental improvement of existing technologies that will upset as few of the internal empire-builders back at the home office as possible. Manifestations in undergrad CS include the perverse fixation on Java as a teaching language, the dogged insistence that students who have never coded before do so according to certain standards of style, and the requirement that graduates be fluent in the latest fads^W developments in the theory of “Software Engineering.” [3] In math, there is very little cause to worry that the material you’ve been taught is bullshit intended to convert you into an epsilon’s worth of market share. [4]
[1] i.e. that math as practiced today, with its many colloquialisms-made-explicit, [2] is what programming languages will look like in the future, when the long-sought “sufficiently smart compiler” is finally made available. In other words, doing math is running programs (“expressing ideas”) written in a language that targets the human brain.
[2] e.g. defining abstract entities like positive or negative infinity, adjoining them to the real numbers, and then claiming that integrals, when they diverge, are “equal to infinity.”
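To spell that out with my own illustration: write \overline{\mathbb{R}} = \mathbb{R} \cup \{-\infty, +\infty\} for the extended real line. Before you adjoin the infinities,

  \int_1^\infty \frac{1}{x} \, dx = +\infty

is just shorthand for “this integral diverges”; after you adjoin them and extend the arithmetic, it becomes a literal equation between two elements of \overline{\mathbb{R}}. The colloquialism has been made explicit, which is precisely the move a “sufficiently smart compiler” for this language would have to support.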
[3] This may sound like a rant, but it’s not. It turns out this kind of thing happens every time a new field gets profitable early. GE and Westinghouse more or less invented the occupation of electrical engineering with their subsidies to MIT. They needed technicians to do the upkeep on their power lines, and methods for maintaining stability in big electrical distribution networks. Most of the reference literature in organic chemistry is printed in German because German chemical companies hired armies of the new “PhDs” in search of the next aniline dye. And even during Bell Labs’ postwar tenure as the high temple of American technology, a substantial portion of every issue of the Bell System Technical Journal was devoted to the subject of creosote.
[4] Sadly, the damage done here persists long after graduation, in the form of the “nobody ever got fired for using X” argument.