sodality2
This article is spot on about a lot of things. One thing I think it fails to address is this:

> I feel confident in asserting that people who say this would not have hired a translator or learned Japanese in a world without Google Translate; they’d have either not gone to Japan at all, or gone anyway and been clueless foreigners as tourists are wont to do.

The correlation here would be something like: the people using AI to build apps previously would simply never have created an app, so it’s not affecting software development as a career as much as you first expect.

It would be like saying AI art won’t affect artists, because the people who would put in such little effort probably would never have commissioned anyone. Which may be a little true (at least in that it reduces the impact).

However, I don’t necessarily know if that’s true for software development. The ability to build software enabled huge business opportunities at very low costs. I think the key difference is this: the people who are now putting in such low effort into commissioning software maybe did hire software engineers before this, and that might throw off a lot of the numbers.


MarkusQ
Conversely, it may create jobs. Why? Because the more elephants you have in your parade, the more jobs there are for folks to walk behind them with a broom and bucket. For decades we've seen tools that "let users write their own software" and every one of them has driven up the demand for people to clean it up, make it scale, make it secure, or otherwise clean up the mess.
scuff3d
CAD, Matlab, and Altium made electrical and mechanical engineers more valuable, not less.

The work got easier, so what we do got more complex.

seventytwo
They’re all just tools. Use the tools or become obsolete.
catlifeonmars
Kind of a false dichotomy. A great example is debuggers vs print statements. Some people get by just fine with print statements, others lean heavily on debuggers. Another example is IDE vs plain vIM.
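
To make the contrast concrete, here's a minimal Python sketch of the two styles side by side (the `median_of` function and its values are hypothetical, just for illustration):

```python
# Two ways to chase the same bug: sprinkle print statements, or drop into a debugger.

def median_of(values):
    ordered = sorted(values)
    mid = len(ordered) // 2
    # Print-statement route: dump intermediate state and eyeball it.
    print(f"ordered={ordered} mid={mid}")
    if len(ordered) % 2:
        return ordered[mid]
    return (ordered[mid - 1] + ordered[mid]) / 2

# Debugger route: uncomment the next line to pause here and inspect
# `ordered` and `mid` interactively instead of printing them.
# import pdb; pdb.set_trace()

print(median_of([3, 1, 4, 1, 5]))
```

Both get you to the same place; which one is faster depends on the person and the problem, which is the point.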

Becoming obsolete is a fear of people who are not willing or able to learn arbitrary problem domains in a short amount of time. In that case learning to use a particular tool will only get you so far. The real skill is being able to learn quickly (enthusiasm helps).

numpad0
So, "useless or dangerous tools" is not a self-contradictory phrase.

Gas powered pogo sticks, shoe fitting X-ray, Radium flavored chocolates, Apollo LLTV, table saws, Flex Seal for joining two halves of boats together, exorbitantly parallelized x86 CPU, rackable Mac Pro with M1 SoC, image generation AI, etc.

Tools can be useless, or be even dangerous.

scuff3d
AI tools fall into the category of just-in-time learning. No even semi-competent software engineer is going to become obsolete because they don't know the newest, most hyped AI tool. And anyone stupid enough to hire on that basis isn't worth working for.

How processors work, how cache and memory work, how the browser works, data structures and algorithms, even design patterns: that's all important foundational knowledge. How to tell an AI to shit out some code or answer a question definitely isn't.

camillomiller
This has been a stable source of business for a while in my niche.
sodality2 OP
Also true! But that world is one where the vast majority of time is spent cleaning up slop code, so if there's a general shift towards that, I think that still changes the job in a significant way. (I don't have extensive history in the industry yet so I may be wrong here)
MarkusQ
<tired old fart voice>

It's all cleaning up slop code. Always has been.

</tired old fart voice>

More optimistically, you can think of "user created code" as an attempt at a design document of sorts; they were trying to tell you (and the computer) what they wanted in "your language". And that dialog is the important thing.

ryandrake
Seriously. Unless you're one of the vanishingly rare few working on true greenfield projects that start with an empty text file, you're basically cleaning up other developers' legacy slop.
I mean even when I'm working on my own projects I'm cleaning up whatever code I wrote when I didn't yet know as much about the shape of the problem.
danielscrubs
We still don’t know what good code is. It is all contextual, and we can never decide what that context should be. We are influenced by what is hip today. Right now it's static typing with Rust, tomorrow it might be energy usage with assembly, after that it might be Python for productivity, after that C# for maintainability.

We can never decide, we just like learning, and there is little real, impactful research into programming as a business.

In two decades we will still collectively say ”we are learning so much”, ignoring that fact.

steveBK123
Google translate is a good example too in terms of better-than-nothing to the completely uninitiated, helpful to someone with a little knowledge, and obviously not a replacement for a professional. That is - the more you know, the more you see its failures.

I know enough Japanese to talk like a small child, make halting small talk in a taxi, and broadly understand a dining menu or restaurant signage. I have also been enough times to understand context where a literal translation to English fails to convey the actual message... for example, cases where they want to say no to a customer but can't literally say no.

I have found Google Translate to be similarly magical and dumb for 15 years of traveling to Japan without any huge improvements other than speed. The visual real-time image OCR stuff was an app they purchased (Magic Lens?) that I had previously used.

So who knows, maybe LLM coding stays in a similar pretty-good-never-perfect state for a decade.

sodality2 OP
> So who knows, maybe LLM coding stays in a similar pretty-good-never-perfect state for a decade.

I think this is definitely a possibility, but the technology is still WAY too early to say that, even if the "second AI winter" the author references never comes, we wouldn't discover tons of other use cases that would change a lot.

davejagoda
If there is another "AI winter" it would be at least the third one: https://en.wikipedia.org/wiki/AI_winter

> The visual real-time image OCR stuff was an app they purchased (Magic Lens?) that I had previously used.

Word Lens, by Quest Visual

https://en.wikipedia.org/wiki/Quest_Visual

tptacek
The reasonable concern people have about AI eliminating coder jobs is that they will make existing coders drastically more productive. "Productivity" is literally defined as the number X of people required to do Y amount of stuff.

I'm not sure how seriously people take the threat of non-coding vibe-coders. Maybe they should! The most important and popular programming environment in the world is the spreadsheet. Before spreadsheets, everything that is today a spreadsheet was a program some programmer had to write.

simonw
I'm still optimistic that the net effect of making existing programmers drastically more productive is that our value goes up, because we can produce more value for other people.
dfxm12
The economy has taught us that when there is an excess of worker productivity, it leads to layoffs. It certainly does not lead to raises.
ryandrake
No software company I have ever worked at had an excess of worker productivity. There was always at least 3-5X as much work needing to be done (bugs to fix, features to implement) as there were engineers to do it. Backlogs just grew and grew until you gave up and mass-closed issues because they were 10 years old.

If AI coding improves productivity, it might move us closer to having 2X as much work as we can possibly do instead of 3X.

SoftTalker
I don't think you can judge "work needing to be done" by looking at backlog. Tickets are easy to enter. If they were really important, they'd get done or people would be hired to do them (employed or contracted). 10 year old issues that never got attention were just never that important to begin with.
That sounds like the famous lump of labour fallacy. When something's cheaper people often spend more on it (Jevons paradox).
bgwalter
This "fallacy" is from 1891 and assumes jobs that require virtually no retraining. A farm worker could in theory clean the factory floor or do one small step in an assembly line within a week.

Nowadays we already have bullshit jobs that keep academics employed. Retraining takes several years.

With "AI" the danger is theoretically limited because it creates more bureaucracy and reduces productivity. The problem is that it is used as an excuse for layoffs.

kasey_junk
Do you have a citation for that?
ohthatsnotright
What a strange thing to ask for a citation on when CEO pay, stock buybacks, and corporate dividends are at all-time highs while worker pay and, honestly, just affording to live continue to crater.
I rather think that LLMs help to write code faster, and also enable folks who would otherwise not program to do so in some capacity. In the end, you end up with more code in the world, and you end up needing more programmers to maintain it and keep it running at scale.
visarga
LLMs don't care you have to maintain the code, they don't get any benefit or loss from their work and are unaccountable when they fuck up. They have no skin in the game.

They don't know the office politics, or go on coffee breaks with the team - humans still have more context and access. We still need people to manage the goals and risks, to constrain the AI in order to make it useful, and to navigate the physical and social world in their place.

danielscrubs
But when everyone started to produce SEO slop, the web died. It’s harder than ever to find truly passionate, single subject blogs from professionals for example.

The AI slop will make it harder for the small guys without a marketing budget (some lucky few will still make it, though). It will slowly kill the app ecosystem, until all we reluctantly trust is FANG. The app pricing reflects it.

alganet
> everything that is today a spreadsheet was a program some programmer had to write

That is incorrect, sir.

First, because many problems were designed to fit into spreadsheets (human systems designed around a convenient tool). It is much more likely that several spreadsheets were _paper_ before, not custom-written programs. In a lot of cases, that paperwork was adapted directly to spreadsheets; no one went through a custom-program intermediate step.

Second, because many problems we have today could be solved by simple spreadsheets, but they often aren't. Instead, people choose to hire developers, for a variety of reasons.

tptacek
I'm not sure we're really disagreeing about anything here. If you think spreadsheets didn't displace any programmers at all, that's contrary to my intuition, but not necessarily wrong --- especially because of the explosion of extrinsic demand for computer programming.
alganet
You say spreadsheet software replaced programmers.

I say spreadsheet software replaced paper.

That's the disagreement. You have intuition, I have sources:

https://en.wikipedia.org/wiki/Spreadsheet#Paper_spreadsheets

https://en.wikipedia.org/wiki/Spreadsheet#Electronic_spreads...

Miraste
I think you're right, AI art and AI software dev are not analogous. The point of art is to create art. There are a lot of traditions and cultural expectations around this, and many of them depend on the artist involved. The human in the loop is important.

Meanwhile, the point of software development is not to write code. It's to get a working application that accomplishes a task. If this can be done, even at low quality, without hiring as many people, there is no more value to the human. In HN terms, there is no moat.

It's the difference between the transition from painting to photography and the transition from elevator operators to pushbuttons.

bgwalter
I'm currently getting two types of ads on YouTube: One is about how the official Israeli Gaza humanitarian efforts are totally fine and adequate (launched during the flotilla with Greta Thunberg).

The other is about an "AI" website generator, spamming every video at the start.

I wonder what kind of honest efforts would require that kind of marketing.

candiddevmike
I think you could extrapolate it and say folks are primarily using GenAI for things they aren't specialists in.
15123123
Yeah, I think the case for AI art is very different. I see major brands, even those that have been very generous with artist commissions, like McDonald's Japan, now using AI art instead.
imiric
> The correlation here would be something like: the people using AI to build apps previously would simply never have created an app, so it’s not affecting software development as a career as much as you first expect.

I don't think the original point or your interpretation is correct.

AI will not cause a loss of software development jobs. There will still be a demand for human developers to create software. The idea that non-technical managers and executives will do so with AI tools is as delusional as it was when BASIC, COBOL, SQL, NoCode, etc. were introduced.

AI will affect the industry in two ways, though.

First, by lowering the skill requirements to create software, it creates a flood of vibe coders competing for junior-level positions. This dilutes the market value of competent programmers, and makes entering the software industry much more difficult.

A related issue is that vibe coders will never become programmers. They will have the ability to create and test software, which will get better if AI tools continue to improve, but they will never learn the skills to debug, troubleshoot, and fix issues by actually programming. This likely won't matter to them or anyone else, but it's good to keep in mind that theirs is a separate profession from programming.

Secondly, it floods the software market with shoddy software full of bugs and security issues. Average quality will go down, causing frustration for users, and security holes will be exploited, increasing the frequency of data leaks, privacy violations, and unquantifiable losses for companies. All this will likely lead to a rejection of AI and vibe coding, and an industry crash not unlike the video game one in 1983 or the dot-com one in 2000. This will happen at the bottom of the Trough of Disillusionment phase of the hype cycle.

This could play out differently if the AI tools reach a level of competence that exceeds human senior software engineers, and have super-human capabilities to troubleshoot, fix, and write bug-free software. In that case we would reach a state where AI could be self-improving, and the demand for human engineers would go down. But I'm highly skeptical that the current architecture of AI tools will be able to get us there.
