“Core curriculum” generally means “what you do that isn’t your major”. Marketable skills go there, not here; it does no one any good to produce a crop of students all of whom have taken two classes each in physics, comp sci, business, etc.
If you count the courses you suggest, there isn’t much room left for a major.
I think a fruitful avenue of thought here would be to consider higher (note the word) education in its historical context. Universities are very traditional places, and historically they provided the education for the elite. Until relatively recently, education did not involve any marketable skills at all—its point was, as you said, “engaging with contemporary culture on an educated level”.
Four to six classes a year, out of about twelve in total? That doesn’t sound too bad to me. I took about that many non-major classes when I was in school, although they didn’t build on each other like the curriculum I proposed.
It may amuse you to note that I was basically designing that as a modernized liberal arts curriculum, with more emphasis on stats and econ and with some stuff (languages, music) stripped out to accommodate major courses. Obviously there’s some tension between the vocational and the liberal aims here, but I know enough people who e.g. got jobs at Google with philosophy degrees that I think there’s enough room for some of the latter.
Four to six classes a year, out of about twelve in total? That doesn’t sound too bad to me. I took about that many non-major classes when I was in school, although they didn’t build on each other like the curriculum I proposed.
I studied at two state universities. At both of them, classes were measured in “credit hours” corresponding to an hour of lecture per week. A regular class was three credit hours and semester loads at both universities were capped at eighteen credits, corresponding to six regular classes per semester and twelve regular classes per year (excluding summers). Few students took this maximal load, however. The minimum semester load for full-time students was twelve credit hours and sample degree plans tended to assume semester loads of fifteen credit hours, both of which were far more typical.
I know enough people who e.g. got jobs at Google with philosophy degrees
Sure, but that’s evidence that they are unusually smart people. That’s not evidence that four years of college were useful for them.
As you probably know, there is a school of thought that treats college education as mostly signaling. Companies are willing to hire people from, say, the Ivies, because these people proved that they are sufficiently smart (by getting into an Ivy) and sufficiently conscientious (by graduating). What they learned during these four years is largely irrelevant.
Is four years of a “modernized liberal arts curriculum” the best use of four years of one’s life and a couple of hundred thousand dollars?
What counts as a ‘marketable skill’, or even what would be the baseline assumption of skill for becoming a fully and generally competent adult in twenty-first century society, might be very different from what was considered skill and competence in society 50 years ago. Rather than merely updating a liberal education as conceived in the Post-War era, might it make sense to redesign the liberal education from scratch? Like, does a Liberal Education 2.0 make sense?
What skills or competencies aren’t taught much in universities yet, but are ones everyone should learn?
That seems like a decent starting point. I don’t know my U.S. history too well, as I’m a young Canadian. However, a cursory glance at the Wikipedia page for the G.I. Bill reveals that, among other benefits, it effectively lowered the cost of college not only for veterans after World War II but also for their dependents. The G.I. Bill was still used through 1973 by Vietnam War veterans, so that’s millions more people than I expected. As attending post-secondary school became normalized, it became the status quo route to better jobs. In the push for equality, people of color and women also demanded equal access to such education, and discriminatory admissions policies were scrapped. This was successful to the extent that several million more Americans attended university.
So, a liberal education originally intended for upper(-middle) class individuals came to be seen as a rite of passage, for status, and then as a way to stay competitive, for the ‘average American’. This trend has continued to the present. It doesn’t seem to me that the typical baccalaureate is optimized for what the economy needed in the 20th century, nor for what would maximize an individual’s chances of employment success. I don’t believe this is true of some STEM degrees, of course. Nonetheless, if there are jobs of the 21st century that don’t yet exist, we’re not well-equipped for those either, because we’re not even equipped to provide the education needed for the jobs of the present.
I hope the history overview wasn’t redundant, but I wanted to establish the design flaws of the current education system before thinking about a new one. Not that we’re designing anything for real here, but it’s interesting to spitball ideas.
If it isn’t covered already in high school, universities might mandate a course on coding, or at least on how to navigate information and data, the same way almost all degrees mandate a first-year course in English or communications. It seems ludicrous that this isn’t already standard, and careers will only involve more computing in the future. There needs to be a way to make the basics of information science intelligible to everyone, like literacy and pre-calculus.
There’s an unsettled debate about whether studying the humanities increases critical thinking skills. Maybe the debate is settled, but I can’t tell the signal from the noise in that regard. To be cautious, rather than removing the humanities entirely, maybe a class could be designed that gets students thinking rhetorically and analytically with words but is broader and more topical than the goings-on of Ancient Greece.
I admit these are obvious and weak suggestions. I don’t believe I can predict the future well, because I don’t know where to start researching what the careers and jobs of the 21st century will be like.
Perhaps we need to re-think what jobs and employment look like in the 21st century and build from there?
Persuasive writing and speaking. Alternatively, interesting writing and speaking.