The work got easier, so what we do got more complex.
Becoming obsolete is a fear of people who are not willing or able to learn arbitrary problem domains in a short amount of time. In that case learning to use a particular tool will only get you so far. The real skill is being able to learn quickly (enthusiasm helps).
Gas powered pogo sticks, shoe fitting X-ray, Radium flavored chocolates, Apollo LLTV, table saws, Flex Seal for joining two halves of boats together, exorbitantly parallelized x86 CPU, rackable Mac Pro with M1 SoC, image generation AI, etc.
Tools can be useless, or be even dangerous.
How processors work, how cache and memory work, how the browser works, data structures and algorithms, even design patterns: these are all important pieces of foundational knowledge. Knowing how to tell an AI to shit out some code or answer a question definitely isn't.
It's all cleaning up slop code. Always has been.
</tired old fart voice>
More optimistically, you can think of "user created code" as an attempt at a design document of sorts; they were trying to tell you (and the computer) what they wanted in "your language". And that dialog is the important thing.
We can never decide; we just like learning, and there is little real, impactful research into programming as a business.
In two decades we will still collectively say "we are learning so much", ignoring that fact.
I know enough Japanese to talk like a small child, make halting small talk in a taxi, and broadly understand a dining menu or restaurant signage. I have also been there enough times to understand context where a literal translation to English fails to convey the actual message. For example, there are cases where they want to say no to a customer but can't literally say no.
I have found Google Translate to be similarly magical and dumb for 15 years of traveling to Japan without any huge improvements other than speed. The visual real-time image OCR stuff was an app they purchased (Magic Lens?) that I had previously used.
So who knows, maybe LLM coding stays in a similar pretty-good-never-perfect state for a decade.
I think this is definitely a possibility, but the technology is still WAY too early to say that, even if the "second AI winter" the author references never comes, we wouldn't discover tons of other use cases that would change a lot.
Word Lens, by Quest Visual
I'm not sure how seriously people take the threat of non-coding vibe-coders. Maybe they should! The most important and popular programming environment in the world is the spreadsheet. Before spreadsheets, everything that is today a spreadsheet was a program some programmer had to write.
If AI coding improves productivity, it might move us closer to having 2X as much work as we can possibly do instead of 3X.
Nowadays we already have bullshit jobs that keep academics employed. Retraining takes several years.
With "AI" the danger is theoretically limited because it creates more bureaucracy and reduces productivity. The problem is that it is used as an excuse for layoffs.
They don't know the office politics, or go on coffee breaks with the team - humans still have more context and access. We still need people to manage the goals and risks, to constrain the AI in order to make it useful, and to navigate the physical and social world in their place.
The AI slop will make it harder for the small guys without a marketing budget (a lucky few will still make it, though). It will slowly kill the app ecosystem, until all we reluctantly trust is FANG. App pricing already reflects it.
That is incorrect, sir.
First, because many problems were designed to fit into spreadsheets (human systems designed around a convenient tool). It is much more likely that many of today's spreadsheets were _paper_ before, not custom-written programs. In a lot of cases, that paperwork was adapted directly to spreadsheets; no one built a custom program as an intermediate step.
Second, because many problems we have today could be solved by simple spreadsheets, but often aren't. Instead, people hire developers, for a variety of reasons.
I say spreadsheet software replaced paper.
That's the disagreement. You have intuition, I have sources:
https://en.wikipedia.org/wiki/Spreadsheet#Paper_spreadsheets
https://en.wikipedia.org/wiki/Spreadsheet#Electronic_spreads...
Meanwhile, the point of software development is not to write code. It's to get a working application that accomplishes a task. If that can be done, even at low quality, without hiring as many people, there is no more value in the human. In HN terms, there is no moat.
It's the difference between the transition from painting to photography and the transition from elevator operators to pushbuttons.
The other is about an "AI" website generator, spamming the start of every video.
I wonder what kind of honest efforts would require that kind of marketing.
I don't think the original point or your interpretation is correct.
AI will not cause a loss of software development jobs. There will still be a demand for human developers to create software. The idea that non-technical managers and executives will do so with AI tools is as delusional as it was when BASIC, COBOL, SQL, NoCode, etc. were introduced.
AI will affect the industry in two ways, though.
First, by lowering the skill requirements to create software it creates a flood of vibe coders competing for junior-level positions. This dilutes the market value of competent programmers, and makes entering the software industry much more difficult.
A related issue is that vibe coders will never become programmers. They will have the ability to create and test software, which will improve if and as AI tools continue to improve, but they will never learn the skills to debug, troubleshoot, and fix issues by actually programming. This likely won't matter to them or anyone else, but it's good to keep in mind that theirs is a separate profession from programming.
Second, it floods the software market with shoddy software full of bugs and security issues. The quality average will go down, causing frustration for users, and security holes will be exploited, increasing the frequency of data leaks, privacy violations, and unquantifiable losses for companies. All this will likely lead to a rejection of AI and vibe coding, and an industry crash not unlike the video game one in 1983 or the dot-com one in 2000. This will happen at the bottom of the Trough of Disillusionment phase of the hype cycle.
This could play out differently if the AI tools reach a level of competence that exceeds human senior software engineers, and have super-human capabilities to troubleshoot, fix, and write bug-free software. In that case we would reach a state where AI could be self-improving, and the demand for human engineers would go down. But I'm highly skeptical that the current architecture of AI tools will be able to get us there.
> I feel confident in asserting that people who say this would not have hired a translator or learned Japanese in a world without Google Translate; they’d have either not gone to Japan at all, or gone anyway and been clueless foreigners as tourists are wont to do.
The analogy here would be something like: the people using AI to build apps would previously simply never have created an app, so it's not affecting software development as a career as much as you'd first expect.
It would be like saying AI art won’t affect artists, because the people who would put in such little effort probably would never have commissioned anyone. Which may be a little true (at least in that it reduces the impact).
However, I don't know if that holds for software development. The ability to build software enabled huge business opportunities at very low cost. I think the key difference is this: the people now putting in such low effort to commission software may well have hired software engineers before this, and that could throw off a lot of the numbers.