It's ironic to see people say this kind of thing without thinking about the old software engineering practices that are now obsolete because, over time, we have created more and more tools to simplify the craft. This is yet another step in that evolution. We are no longer using punch cards or writing assembly code, and in the future we might not write actual code at all, just instruct AIs to achieve goals. This is progress.

> We are no longer using punch cards or writing assembly code

I have done some ROM hacks, so I have seen what compilers have done to assembly quality and readability. When I hear programmers complain that having to debug AI-written code is harder than just writing it yourself, that's probably exactly how assembly coders felt when they saw what compilers produce.

One can regret the loss of elegance and beauty while accepting the economic inevitability.

Not just elegance and beauty, but also functionality. AI is as much a victim as humans are of the old rule: if you put your maximum wit into writing the code, you won't have any headroom left for debugging it.
That's a future that isn't the case today. With compilers, I rarely have to dig into assembly code; generally I just work in the domain (programming languages) that I'm comfortable in. Compiler bugs are rare (but they do exist, and I have had to dig into assembly to debug them before).

LLMs are nowhere close to that level today. They spit out a bunch of mediocre code that the programmer then needs to maintain and fix bugs in. And the larger and more complicated the code base, the harder it is for the LLM to work with. There's a huge leaky abstraction here, going from initial vibe coding to then having to dig into the weeds; fixing bugs written by another human is difficult enough, not to mention ones written by some random LLM.

Well, the only issue I have with that is that coding is already a fairly easy way to encode logic. Sure, writing Rust or C isn't easy, but writing some memory-managed code is so easy that I wonder whether we are helping ourselves by removing that much thinking from our lives. It's not quite the same optimization as building a machine so that we don't have to carry heavy stones ourselves; now we are building a machine so we don't have to do heavy thinking ourselves. This isn't even specific to coding: lawyers essentially also encode logic into text form. What if lawyers in the future increasingly just don't bother understanding laws and let an AI form the arguments?

I think there is a difference here for the future of humanity, one that has never happened before in our tool-making history.

> Now we are building a machine so we don't have to do heavy thinking ourselves.

There are a lot of innovations that helped us avoid heavy thinking. Think calculators. We will just move to a higher order of magnitude of problems to solve. Software development is a means to an end: instead of thinking hard about coding, we should be thinking hard about the problem being solved. That will be the future of the craft.

Calculators are a good example of where letting too much knowledge slip can be an issue. So many calculator apps are made by people with no grasp of order of operations or of choosing data types. They could look it up, but they don't know they need to.

It's one of those problems that seems easy, but isn't. The issue seems to arise when we let an aid for a process replace gaining the knowledge behind the process. You at least need to know what you don't know, so you can develop an intuition for when outputs don't (or might not) make sense.

https://chadnauseam.com/coding/random/calculator-app

(recently: https://www.hackerneue.com/item?id=43066953)
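The linked article covers in depth why a "simple" calculator app is hard; one core trap is quick to show. Naively representing decimal input with binary floating point produces answers that don't match what the user typed, while exact decimal arithmetic behaves the way the keypad suggests. A minimal Python illustration:

```python
from decimal import Decimal

# Binary floats cannot represent 0.1 or 0.2 exactly, so the
# "obvious" implementation surprises the user:
print(0.1 + 0.2)         # 0.30000000000000004
print(0.1 + 0.2 == 0.3)  # False

# Exact decimal arithmetic matches what was keyed in:
print(Decimal("0.1") + Decimal("0.2"))  # 0.3
```

This is exactly the kind of detail a developer won't think to look up unless they already know floating point representation is lossy.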

Complaining about "order of operations" is equivalent to saying Spanish speakers are ignorant because they don't know French.

It's especially silly because one thing calculators are known for is being inconsistent about order of operations between designs.
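That inconsistency is easy to make concrete. The two common designs can be sketched in a few lines of Python (a toy model, not any particular calculator's implementation): the same keystrokes give different answers depending on whether the device applies operator precedence or executes each operation as it is keyed in.

```python
import ast
import operator

OPS = {ast.Add: operator.add, ast.Sub: operator.sub,
       ast.Mult: operator.mul, ast.Div: operator.truediv}

def precedence_eval(expr: str):
    """The 'scientific calculator' design: apply standard operator
    precedence, here by reusing Python's own expression parser."""
    def walk(node):
        if isinstance(node, ast.BinOp):
            return OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.Constant):
            return node.value
        raise ValueError("unsupported expression")
    return walk(ast.parse(expr, mode="eval").body)

def immediate_eval(expr: str) -> float:
    """The 'four-function calculator' design: evaluate strictly left
    to right, applying each operation as soon as it is entered."""
    ops = {"+": operator.add, "-": operator.sub,
           "*": operator.mul, "/": operator.truediv}
    tokens = expr.split()
    result = float(tokens[0])
    for op, value in zip(tokens[1::2], tokens[2::2]):
        result = ops[op](result, float(value))
    return result

print(precedence_eval("1 + 2 * 3"))  # 7
print(immediate_eval("1 + 2 * 3"))   # 9.0
```

Neither answer is "ignorant"; they are just different conventions, which is the point about Spanish and French above.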

Management has resented people capable of creating non-trivial, novel, and valuable software for several decades, and trying to get LLMs to generate code for them is just the latest chapter in this power struggle.
The handful of people writing your compilers, JIT-ers, etc. are still writing assembly code. There are probably more of them today than at any time in the past and they are who enable both us and LLMs to write high level code. That a larger profession sprang up founded on them simplifying coding enough for the average coder to be productive didn't eliminate them.

The value of most of us as coders will drop to zero but their skills will remain valuable for the foreseeable future. LLMs can't parrot out what's not in their training set.

This is not progress; this is regression. Who is going to maintain and further develop the software, if not actual programmers? In the end, the LLMs stop getting new information to be trained on, and they can't truly innovate (since they're not AGI).
Elsewhere in this discussion thread[0], 'ChrisMarshallNY compares this to feelings of insecurity:

> It’s really about basic human personal insecurity, and we all have that, to some degree. Getting around it, is a big part of growing up (...)

I believe he's right.

It makes me think back to my teenage years, when I first learned to program because I wanted to make games. Within the amateur gamedev community, we had this habit of sneering at "clickers" - Klik&Play & other kinds of software we'd today call "low-code", that let you make games with very little code (almost entirely game logic, and most of it "clicked out" in GUI), and near-zero effort on the incidental aspects like graphics, audio, asset management, etc. We were all making (or pretending to make) games within scope of those "clickers", but using such tools felt like cheating compared to doing it The Right Way, slinging C++ through blood, sweat and tears.

It took me over a decade to finally realize how stupid that perspective was. Sure, I've learned a lot; a good chunk of my career skills date back to those years. However, whatever technical arguments we levied against "clickers", most of them were bullshit. In reality, this was us trying to feel better, special, doing things The Hard Way, instead of "taking shortcuts" like those lazy people... who, unlike us, actually released some playable games.

I hear echoes of this mindset in a lot of "LLMs will rot your brain" commentary these days.

--

[0] - https://www.hackerneue.com/item?id=43351486

Insecurity is not just a part of growing up, it's a part of growing old as well, a feeling that our skills and knowledge will become increasingly useless as our technologies advance.

Humans are tool users. It is very difficult to pick a point in time and say, "it was here that we crossed the Rubicon". Was it the advent of spoken word? Written word? Fire? The wheel? The horse? The axe? Or in more modern times, the automobile, calculator, or dare I say the computer and the internet?

"With the arrival of electric technology, man has extended, or set outside himself, a live model of the central nervous system itself. To the degree that this is so, it is a development that suggests a desperate suicidal autoamputation, as if the central nervous system could no longer depend on the physical organs to be protective buffers against the slings and arrows of outrageous mechanism."

― Marshall McLuhan, Understanding Media: The Extensions of Man

Are we at battle with LLMs or with humanity itself?

Things like memory-safe languages and managed-DOM JS frameworks are solved problems of limited scope for most business computing needs, outside of some very marginal edge cases.

AI-generated code? That seems a way off from being a generalized solved problem in an iterative SDLC at a modern tech company trying to get leaner, disrupt markets, and survive in a complex world. I for one very much support it for engineers with enough unaided experience under their belt to judge the output, but the idea that we're potentially going to train new devs at unfamiliar companies on this stuff? Yikes.

Progress is more than just simplistic effort reduction. The attitude that more efficient technology is always good is why society is quickly descending into a high-tech dystopia.
It's business, not technology, that makes us descend into a high-tech dystopia. Technology doesn't bring itself into existence or into market - at every point, there are people with malicious intent on their mind - people who decide to commit their resources to commission development of new technology, or to retask existing technology, specifically to serve their malicious goals.

Note: I'm not saying this is a conspiracy or a cabal - it's a local phenomenon happening everywhere. Lots of people with malicious intent, and even more indifferent to the fate of others, get to make decisions hurting others, and hide behind LLCs and behind tech and technologists, all of which get blamed instead.
