Teaching CS During Take-Off
I stayed up too late collecting way-past-deadline papers and writing report cards. When I woke up at 6, this anxious email from one of my grade 11 Computer Science students was already in my inbox.
Student: Hello Mr. Carle, I hope you’ve slept well; I haven’t.
I’ve been seeing a lot of new media regarding how developed AI has become in software programming, most relevantly videos about Cognition’s new artificial intelligence software developer, Devin.
Things like these are almost disheartening for me to see as I try (and struggle) to get better at coding and developing software. It feels like I’ll never use the information that I learn in your class outside of high school because I can just ask an AI to write complex programs, and it will do it much faster than I would.
I’d like to know what your thoughts on this are. Do you think AI will replace human software developers, as its creators claim it will?
My response: Buddy, that is a big question for 5:15 am.
First, some AI-horizon thoughts:
Software development as a field will look incredibly different in 10 years.
My priors say that MOST of human intellectual+economic activity will ALSO be radically different in 10 years.
I have a very small p(doom) for the 10-year horizon. That means I don’t expect human-equivalent AGIs to completely disrupt human civilisation within 10 years.
The delta between how fast AI will affect software engineering and how fast AI will transform other (roughly speaking) white-collar careers is relatively small. That means I expect the effect of AI on, say, hedge fund management to be similar to its effect on software engineering.
Then some priors I have for teaching IB Computer Science in the middle of this take-off:
I don’t think becoming a software engineer is the modal outcome for IBCS students.
I believe that most of the long-term personal utility from IBCS (or any other intro CS exposure) comes from shifting a student’s mental model of how the modern social and economic system interacts with / depends on these technologies.
While modern AI tools are light-years beyond the simple Von Neumann CPU models and intro Python we’re studying, the class does address the foundations of those systems. Similarly, HL Analysis and HL Physics don’t directly cover the math and physics that underpin these huge ML systems, but that foundation IS there. You can’t approach the superstructure without it. (A toy sketch of the kind of CPU model I mean follows below.)
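To make “foundations” concrete, here is a minimal sketch, in plain Python, of the kind of fetch-decode-execute loop an intro CPU-model unit walks through. The instruction set and program are my own illustration, not the IBCS syllabus verbatim:

```python
# Toy Von Neumann machine: one accumulator, a program counter, and a
# fetch-decode-execute loop over instructions stored in "memory".
# Illustrative only; the opcodes here are invented for this sketch.
memory = [
    ("LOAD", 7),      # put the literal 7 in the accumulator
    ("ADD", 5),       # add 5 to the accumulator
    ("PRINT", None),  # output the accumulator
    ("HALT", None),   # stop the machine
]

accumulator = 0
program_counter = 0

while True:
    opcode, operand = memory[program_counter]  # fetch + decode
    program_counter += 1
    if opcode == "LOAD":                       # execute
        accumulator = operand
    elif opcode == "ADD":
        accumulator += operand
    elif opcode == "PRINT":
        print(accumulator)                     # prints 12
    elif opcode == "HALT":
        break
```

The point isn’t the toy itself; it’s that the same store/fetch/execute skeleton sits underneath every system the student is worried about.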
So, in summary, if your concern is “the world seems to be changing fast. This class is hard, and I don’t think there’s any chance that I will find a 2022 Novice SoftwareDev job when I’m out of university in 2029,” I would strongly agree with that sentiment.
I have a Ron Swanson detachment on the importance of formal schooling. If your question was “is a traditional education sequence the best way to prepare myself for the turbulent AI takeoff period,” then my answer is a strong no. Education is intrinsically reflective and backward-looking.
But I’m employed as a high school teacher. And your parents have decided to live here and send you to this school. So, I’m not sure if advice on that axis is actionable for either of us. There’s also a huge chasm between “this isn’t the best of all possible options” and “this has zero value.”
If I reframed your statement as “given that I’m in this limited-option IB program, what classes will provide me the best foundation to find opportunities and make novel insights in the turbulent AI takeoff period?” I would feel confident recommending IBCS.
That doesn’t make learning to code any easier.
Is that a good answer to a 17-year-old? Is there a good answer to this?
One of the best parts of teaching is watching young people wake up to the real, fundamental issues and challenges of human civilisation and existence. At some point, their eyes open and they’re amazed/outraged by the complexity and incompetence that make up our world.
My entire push into MakerEd was largely about getting kids to recognise the complexity of their built world, and that they are all smart enough to remake it. I hope that’s still true in 2029.
I think that’s a great answer—assuming that’s what you believe.
For me, I don’t believe your third AI-horizon point on the timelines: I think AGI will probably be here by 2029, and could indeed arrive this year. And even if it goes well and humans maintain control and we don’t get concentration-of-power issues… the software development skills your students are learning will be obsolete, along with almost all skills.
Thanks for the reply to my first post.
While I still put most of my probability mass on human civilization being broadly recognizable in 2034, I know that my confidence on that is proportional to my LW karma.
There’s a lot about communicating with kids as a teacher that pushes me towards Simulacra Level 2 or higher. If we’re on something closer to your 2029 timeline, my honest advice to students would be:
Get Out of School ASAP and look for something interesting on the Jagged Frontier (use AI in a way you find useful that few others understand) or dedicate time to building craft skills that would have been recognizable 100 years ago.
My estimate is that I could give that advice to 3-5 students before I had to look for another job.
Gotcha. A tough situation to be in.
What about “Keep studying and learning in the hopes that (a) I’m totally wrong about AGI timelines and/or (b) government steps in and prevents AGI from being built for another decade or so?”
What about “Get organized, start advocating to make (b) happen?”
I’m on the PauseAI Discord in part to expose my students to that level of coordinated planning and direct action.
My Simulacra Level 1 perspective is that most students generally benefit from being in school. While some of that benefit comes from way-downstream consequences (thankfully, I took an EE class back in 1983… :), the vast majority of the positive benefits happen in the immediate term.
“Keep studying and learning” is a Simulacra Level 2 admonition that helps maintain the benefits I truly believe are there. (Yes, there are lots of problems in lots of schools. I can only ever speak in aggregates here.)
Importantly, a significant number of adolescent problems come from antagonistic relationships between children and their parents/caregivers. If those adults are supportive of a student leaving school, then I would happily hand them a copy of Blake Boles’s “College Without High School” (https://www.blakeboles.com/cwhs/). If the adults are insistent on normal “go every day” school, I think the negative consequences from that fight will most often dominate the positive changes.
What are your thoughts on skills that the government has too much control over? For example, if we get ASI in 2030, do you imagine that a doctor will be obsolete in 2032, or will the current regulatory environment still be relevant?
And how much of this is determined by “labs have now concentrated so much power that governments are obsolete”?
If we get ASI in 2030, all humans will be economically and militarily obsolete in 2030, and probably politically obsolete too (though if alignment was solved then the ASIs would be acting on behalf of the values and intentions of at least some humans). The current regulatory regime will be irrelevant. ASI is powerful.
Also agree on the timelines. If we don’t take some dramatic governance actions, then AGI looks probable in the next 5 years, and very probable in the next 10. And after that, the odds of the world and society being similar to the way they have been for the past 50 years seem vanishingly small. If you aren’t already highly educated in the technical skills needed to help with this, political action is probably your best bet for having a future that conforms to your desires.
You may have already qualified this prediction somewhere else, but I can’t find where. I’m interested in:
1. What do you mean by “AGI”? Superhuman at any task?
2. “probably be here” means >= 50%? 90%?
Yep. Or if we wanna nitpick and be precise, better than the best humans at X, for all cognitive tasks/skills/abilities/jobs/etc. X.
>50%.
<3
That depends on the student. It definitely would have been a wonderful answer for me at 17. I will also say: well done, because I can think of at most two of all my teachers, K-12, who might have been capable of giving that good an answer to pretty much any deep question about any topic this challenging (and of those two, only one might have put in the effort to actually do it).
“[I]s a traditional education sequence the best way to prepare myself for [...?]”
This is hard to answer because in some ways the foundation of a broad education in all subjects is absolutely necessary. And some subjects (math, for example) are a lot harder to patch in later if you are bad at them at, say, 28.
However, the other side of this is that once some foundation is laid and someone has some breadth and depth, the answer to the above question, with regard to nearly anything, is often (perhaps usually) “Absolutely Not.”
So, for a 17-year-old, Yes. For a 25-year-old, you should be skipping as many prerequisites and hoops as possible to do precisely what you want. You should not spend too much time on the traditional pedagogical steps: once you know enough, a lot can be learned along the way and bootstrapped to what you need while working on harder or more cutting-edge projects or coursework. To do this type of learning, you have to be “all in,” and it feels exceedingly hard, but you get to a high level. Also, you should not spend too much time on books and curricula that are not very good.
Somewhere in the middle of these two points, though, are things that are just being done badly (math, for example, in the USA).
Agree that the delta is “small”, but it might be significant:
LLMs are especially good at coding. Some reasons:
Large amount of training data
Most knowledge is text-based and explicit
Big economic incentive (developers are especially expensive)
Many companies are focused on AI for coding (Devin, GitHub, etc.), and they will probably advance fast because these teams are using their own product (cf. Figma’s CPO on the importance of dogfooding)
LLMs already integrate very well into developers’ workflow
LLMs’ hallucinations are not a big deal in coding because it’s ~easy to review code and mistakes are usually not deadly (see the sketch after this list)
LLMs cannot automate bullshit jobs, so many people are safe (half joking)
LLMs will soon have the level of an intern, then junior, then mid, then senior… Can your students outlearn LLMs?
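A minimal sketch of the review point above: here `generated` is just a stand-in string for whatever code an assistant returned (in real use it would come from an LLM API), and the test cases show how mechanically a hallucinated implementation gets caught:

```python
# Why coding is a forgiving domain for LLMs: the output can be verified
# mechanically before anyone trusts it. `generated` is a stand-in for an
# assistant's reply, not a call to any particular API.
generated = """
def is_leap_year(year: int) -> bool:
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)
"""

namespace = {}
exec(generated, namespace)  # fine for a sketch; never exec untrusted code blindly

# The "review" step: a hallucinated implementation fails loudly here.
for year, expected in [(2000, True), (1900, False), (2024, True), (2023, False)]:
    assert namespace["is_leap_year"](year) == expected
print("all checks passed")
```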