Sure, I learned lots of stuff I've never used, like relational algebra. But I also learned lots of stuff I use a lot, and it's unlikely I'd have studied most of it on my own. During my degree I also had the time and opportunity to pursue plenty of topics outside the mandated course material; you're not limited to what they force you to learn.
So sure, if you have the motivation, discipline, and resourcefulness to learn all that stuff on your own, go right ahead. Most people aren't even close. Most people are much better off with a degree.
I don't think one can seriously argue that. This is as much a meme as anything. I know it's popular to rag on devs writing inefficient software, but there are plenty of apps with functions where a user couldn't possibly notice the difference between O(n^2) and O(1). You wouldn't take the time to make everything O(1) for no perceptible speedup just because someone told you that's what good code is; that's just wasting dev time.
In fact, one of the first things you learn is that O(1) can be slower. Constant time is not good if the constant is big and n is small.
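A quick sketch of that constant-factor point (my own illustration, not from the comment above, and using O(log n) vs O(n) rather than O(1)): asymptotically better doesn't mean faster at small n. In CPython, a plain linear scan of a tiny list typically beats a bisect-based binary search, because the Python-level call overhead dwarfs the few comparisons saved. Exact numbers vary by machine; the real lesson is to measure.

    # Illustrative benchmark: for a tiny sorted list, a linear scan often
    # beats binary search despite the worse big-O, because constant-factor
    # overhead dominates at small n.
    import bisect
    import timeit

    small = list(range(8))  # tiny sorted list

    def binary_contains(xs, x):
        i = bisect.bisect_left(xs, x)
        return i < len(xs) and xs[i] == x

    t_scan = timeit.timeit(lambda: 7 in small, number=1_000_000)
    t_bin = timeit.timeit(lambda: binary_contains(small, 7), number=1_000_000)
    print(f"linear scan:   {t_scan:.3f}s")
    print(f"binary search: {t_bin:.3f}s")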
I fixed one where a report took 25 minutes to generate; after switching an O(n^2) list lookup to a dict, it took less than 5. Still embarrassingly slow, but a lot better.
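For anyone who hasn't hit this, the pattern usually looks something like the sketch below (the names are invented for illustration, not from the actual report code). Each `in` test against a list rescans it, so the loop is O(n^2) overall; building a set (or dict) once drops each lookup to O(1) on average.

    # Hypothetical before/after -- `orders` and `customer_ids` are made-up
    # names, not the real report's code.

    # Before: each `in` scans the whole list, O(n^2) overall
    def slow_filter(orders, customer_ids):  # customer_ids: list
        return [o for o in orders
                if o["customer_id"] in customer_ids]

    # After: build a set once; O(1) average per lookup, O(n) overall
    def fast_filter(orders, customer_ids):
        wanted = set(customer_ids)
        return [o for o in orders
                if o["customer_id"] in wanted]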
There are also a lot of cases where it didn't matter when the dev wrote it because there were 400 rows in the db, but 5 years later there are a lot more rows, so now it matters.
It doesn't cost anything to just use a better algorithm. It usually takes exactly the same amount of time to write, and even if it's marginally slower at small n, who cares? I don't give a shit about saving nanoseconds. I care about the quadratic (or worse) time waste that happens when you don't consider what happens when the input grows.
For small inputs it doesn't matter what you do. Everything is fast when the input is small. That's why it makes sense to prefer low complexity by default.
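To make that concrete, here's a rough sketch (my own, with invented dedupe functions): both versions feel instant at n=100, and only the linear one stays fast as n grows. Timings are illustrative and machine-dependent.

    # Both are instant at small n; the quadratic one blows up as n grows.
    import time

    def dedupe_quadratic(xs):
        out = []
        for x in xs:            # n iterations...
            if x not in out:    # ...each an O(n) list scan
                out.append(x)
        return out

    def dedupe_linear(xs):
        seen, out = set(), []
        for x in xs:
            if x not in seen:   # O(1) average set lookup
                seen.add(x)
                out.append(x)
        return out

    for n in (100, 2_000, 20_000):  # keep n modest; the quadratic one hurts
        data = list(range(n))
        t0 = time.perf_counter(); dedupe_quadratic(data)
        t1 = time.perf_counter(); dedupe_linear(data)
        t2 = time.perf_counter()
        print(f"n={n:>6}: quadratic {t1 - t0:.3f}s, linear {t2 - t1:.3f}s")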
In my experience, those who lack these qualities don't have a chance in tech in the first place, so save yourself a lot of debt.
But until then, we in the US live in a capitalist hellscape where we have to prioritize survival, which means focusing only on marketable skills to get a job. After that, one can pay for college once they can afford it, if they want that experience for personal enrichment.
I think college is useless for the ones out there who already know how to code and collaborate and have the other skills the industry is looking for. Many out there are building serious projects on GitHub and other places without having any degree.
Also, most of the stuff you learn in college has absolutely no relation to what you will do in the industry.