A chemistry, physics, or even MechE BS grad is coming out at only the very beginning of their training, and will require lots of specific on-the-job training if they go into industry. School is about the principles of the field and how to think critically / experimentally. E.g. software debugging requires an understanding of hypothesis testing and isolation before the details of specific tech ever come into play. This is easy to take for granted because many people have that skill naturally; others need to be trained and still never quite get it.
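To make the hypothesis-testing/isolation point concrete, here's a minimal, made-up sketch (first_bad_index, the toy records, and fails_to_parse are all invented for illustration, not taken from any course): isolate which input breaks a parser by bisecting over prefixes, which is the same idea git bisect applies to commit history.

    # Toy illustration of isolation by bisection: narrow down which input
    # triggers a failure before reaching for a debugger or any specific tooling.
    def first_bad_index(items, is_bad):
        """Binary-search for the first index i where is_bad(items[:i+1]) holds,
        assuming failures are monotone (once a prefix fails, longer ones do too)."""
        lo, hi = 0, len(items) - 1
        while lo < hi:
            mid = (lo + hi) // 2
            if is_bad(items[:mid + 1]):
                hi = mid        # hypothesis holds: culprit is at or before mid
            else:
                lo = mid + 1    # hypothesis rejected: culprit is after mid
        return lo

    # Hypothetical failing batch: the record "c" has no "=", so parsing blows up.
    records = ["a=1", "b=2", "c", "d=4"]

    def fails_to_parse(batch):
        try:
            dict(r.split("=", 1) for r in batch)
            return False
        except ValueError:
            return True

    print(first_bad_index(records, fails_to_parse))  # -> 2

None of that requires knowing any particular tool first; the tools just make the loop faster.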
Edit: of course, if only 5% of grads are going on to research, then maybe the department is confused. A lot of prestigious schools market themselves as research institutions and advertise their undergrad research opportunities, etc. If you choose to go there, then you know what you're getting into.
Out of one side of their mouth maybe.
Out of the other, they absolutely are not telling potential undergrads that they may tolerate them but they're really focused on research.
This. I went to the University of Iowa in the aughts. My experience was that because they didn't cover much of the material in this MIT Missing Semester 2026 list, a lot of the classes went poorly. They had trouble moving students through the material on the syllabus because most students would trip over these kinds of computing basics, which are necessary to experiment with the DS+A theory via actual programming. And the department neither added a prereq that covers these basics nor incorporated them into other courses' syllabi. Instead, they kept trying what wasn't working: having a huge gap between the nominal material and what the average student actually got (but somehow kept going on to the next course). I don't think it did anyone any service. They could have taken the time to actually help most students understand the basics, they could have proceeded at a quicker pace through the theoretical material for the students who actually did understand the basics, they could have ensured their degree actually was a mark of quality in the job market, etc.
It's nice that someone at MIT is recognizing this and putting together this material. The name and about page suggest, though, that it's not something the department has long recognized and uncontroversially integrated into the program (perhaps as an intro class you can test out of), which is still weird.
While this comes out of CSAIL, I wouldn't ascribe too much institutional recognition to this. Given the existence of the Independent Activities Period, it's probably a reasonable place for it given MIT's setup. Other institutions have "math camp" and the like before classes start.
It's probably a reasonable compromise. Good schools have limited bandwidth or interest in remedial education/hand-holding and academics don't have a lot of interest in putting together materials that will be outdated next year.
I think they rarely escape doing this hand-holding unless they're actually willing to flunk out students en masse. Maybe MIT is; the University of Iowa certainly wasn't. So they end up just in a state of denial in which they say they're teaching all this great theoretical material but they're doing a half-assed job of teaching either body of knowledge.
I also don't think this knowledge gets outdated that quickly. I'd say if they'd put together a topic list like this for 2006, more than half the specific tools would still be useful, and the concepts from the rest would still transfer over pretty well to what people use today. For example, yeah, we didn't have VS Code and LSP back then, but IDEs didn't look that different. We didn't (quite) have tmux but used screen for the same purpose, etc. Some things are arguably new (devcontainers have evolved well beyond setting up a chroot jail, AI tools are new) but it's mostly additive. If you stay away from the most bleeding-edge stuff (I'm not sure the "AI for the shell (Warp, Zummoner)" is wise to spend much time on) you never have to throw much out.
There certainly are fits and starts in the industry. But I'm not sure things 5 years or so ago looked THAT different from today. (Leaving aside LLMs.)
From my peripheral knowledge, MIT does try to hand-hold to some degree. It isn't the look-left, look-right, one-of-those-people-won't-be-here-next-year sort of place. But, certainly, people do get in over their heads at some places. I tutored/TAed in (business) grad school, and some people just didn't have the basics. I couldn't teach remedial high school arithmetic from the ground up, especially for some people who weren't even willing to try seriously.
I could see it being obsolete quickly to the extent that when someone was trying to learn devops and saw a book on the (virtual) shelf that didn't cover containers next to one that did, they'd pick the latter every time. You probably saw this in your sales tanking. But I'm not sure many of the words you actually did write became wrong or unimportant either. That's what I mean by additive. And in the context of a CS program, even if their students were trying out these algorithms with ridiculously out-of-date, turn-of-the-century tools like CVS, they'd still have something that works, as opposed to fumbling because they have no concept of how to manage their computing environment.
the same MIT that doesn't give out grades in the first year? (just Pass / NoPass)
the high achievers who scored solid grades to get there literally kill themselves when they pull Cs and Ds, even though it's a hard class and is sort of "look left, look right"
Yes, poor grades were often a shock to people accustomed to being straight-A students in high school. Though most made it through, or, in some cases, ended up going elsewhere.
I think there are a number of ways in which financial incentives and University culture are misaligned with this reality.
But it's also the case that (only half-joking) a lot of faculty at research universities regard most undergrads as an inconvenience at best.
In my experience, the more advanced the material, the worse the teachers are. Or more precisely, the improvement in teaching does not usually keep up with the increase in difficulty. (There appears to be no correlation, in fact.)
Which implies that the better a university is (the more difficult the material), the more it relies on filtering rather than education.
Which seems to be in line with how the top universities are perceived anyway as selection criteria, primarily places to get a network, rather than places to get an education.
It's neither good nor bad, but it is a little sad :)
---
I do notice that my assumption here is that the more difficult the university is, the better it is. I think this is broadly true, both objectively and subjectively, at least for my purposes.
I thought that was pretty strange at the time because like 5% of the students end up going into research. So that was basically like him saying I'm totally cool with our educational program being misaligned for 95% of our customers...
Maybe it makes sense for the big picture though. If all the breakthroughs come from those 5%, it might benefit everyone to optimize for them. (I don't expect they would have called the program particularly optimized either though ;)