That seems like a decent starting point. I don’t know my U.S. history too well, as I’m a young Canadian. However, a cursory glance at the Wikipedia page for the G.I. Bill in the U.S. reveals that, among other benefits, it effectively lowered the cost of post-secondary education not only for veterans after World War II, but also for their dependents. G.I. Bill benefits were still being used through 1973 by Vietnam War veterans, so that’s millions more people than I expected. As attending post-secondary school became normalized, it shifted toward being the status quo for getting better jobs. In the name of equality, people of color and women also demanded equal access to such education by having discriminatory admissions policies and the like scrapped. This was successful to the extent that several million more Americans attended university.
So, a liberal education originally intended for upper(-middle) class individuals came to be seen as a rite of passage, then as a marker of status, and then as a way for the ‘average American’ to stay competitive. That trend has carried through to the present. It doesn’t seem to me that the typical baccalaureate is optimized for what the economy of the 20th century needed, nor for what would maximize an individual’s chances of employment success. I don’t believe this holds for some STEM degrees, of course. Nonetheless, if there are jobs of the 21st century that don’t yet exist, we’re not well-equipped for those either, because we’re not even equipped to provide the education needed for the jobs of the present.
I hope the history overview wasn’t redundant, but I wanted to establish the design flaws of the current education system before thinking about a new one. Not that we’re designing anything for real here, but it’s interesting to spitball ideas.
If it isn’t already required in high school, universities might mandate a course on coding, or at least on how to navigate information and data, the same way almost all degrees mandate a first-year course in English or communications. It seems ludicrous this isn’t already standard, and careers will only involve more computing in the future. There needs to be a way to make the basics of information science intelligible to everyone, like literacy and pre-calculus.
There’s an unsettled debate about whether studying the humanities increases critical thinking skills. Maybe the debate is settled, but I can’t tell the signal from the noise in that regard. To be cautious, rather than removing the humanities entirely, maybe a class could be designed that gets students thinking rhetorically and analytically with words, but is broader or more topical than the goings-on of Ancient Greece.
I admit the suggestions I’ve made are obvious and weak. I don’t believe I can predict the future well, because I don’t know where to start researching what the careers and jobs of the 21st century will be like.