
psunavy03
"Great news, boss! We invented this new tool that allows nontechnical people to write code in English! Now anyone can deploy applications, and we don't have to hire all those expensive developers!"

"Wow, show it to me!"

"OK here it is. We call it COBOL."


musicale
FORTRAN (FORmula TRANslator) was another "AI" project in "automatic programming":

"Before 1954, almost all programming was done in machine language or assembly language. Programmers rightly regarded their work as a complex, creative art that required human inventiveness to produce an efficient program."

-John Backus, "The History of Fortran I, II, and III", https://dl.acm.org/doi/10.1145/800025.1198345

"The IBM Mathematical Formula Translating System or briefly, FORTRAN, will comprise a large set of programs to enable the IBM 704 to accept a concise formulation of a problem in terms of a mathematical notation and to produce automatically a high speed 704 program for the solution of the problem."

-IBM, "Specifications for the IBM Mathematical FORmula TRANslating System, FORTRAN", http://archive.computerhistory.org/resources/text/Fortran/10...

aitchnyu
Fortran promised to eliminate debugging. In 2015, I was taught that React was a functional-programming way to create very fast, bug-free apps, and the project manager still found ways to push us to the hair-on-fire status quo.

"FORTRAN should virtually eliminate coding and debugging" https://www.hackerneue.com/item?id=3970011

Gravityloss
SQL had similar promises.

But it has still been immensely useful and a durable paradigm, even though its use hasn't played out exactly as envisioned.

Excel enters the chat
bonoboTP
For some strange reason, Excel really managed to do it. Many, many people who don't think of themselves as anywhere near being programmers somehow get at ease in front of Excel, and often inadvertently, almost unawares, end up learning programming concepts and building far more complex computational applications than has been possible with any other tool for non-developers.
Izkata
"Now hold still, I'm about to perform a miracle."

https://www.youtube.com/watch?v=kOO31qFmi9A

glimshe
You're joking but it's true. I'm sure you know that. SQL had similar claims... Declarative: say what you need and the computer will do it for you. Also written in English.
And compared to what we had before SQL, it is much easier to use, and a lot more people are able to use it.
noworriesnate
But software developers often struggle to use sql and prefer using ORMs or analytical APIs like polars; the people who excel at sql are typically not programmers, they’re data engineers, DBAs, analysts, etc.

Maybe a similar bifurcation will arise where there are vibe coders who use LLMs to write everything, and there are real engineers who avoid LLMs.

Maybe we’re seeing the beginning of that with the whole bifurcation of programmers into two camps: heavy AI users and AI skeptics.

ruszki
What you can achieve with standard SQL is taught at universities. The whole package. I've never met a developer who struggled with that. When you use ORMs you need to follow SQL's logic anyway. People use ORMs to avoid painful data conversions, not to avoid the logic. Data engineers, DBAs, analysts, etc. excel in specific databases, not in "SQL".
FranzFerdiNaN
I've worked in BI and data engineering my whole career and I've met plenty of programmers who struggled immensely with SQL once it went further than SELECT and GROUP BY. And don't get me started on their database design skills. It's way too often a disaster hidden behind "it works for the software, so good enough".

I'm more surprised by software engineers who do know these things than by the ones who don't.

maccard
I’ve worked with gameplay programmers who can’t do simple 3D math, C++ programmers who fundamentally don’t understand pointers, backend developers who didn’t understand that globals are shared state and can cause race conditions, etc.

It’s not that SQL is hard, it’s that for any discipline the vast majority of people don’t have a solid grasp of the tools they’re using. Ask most tradespeople about the underlying thing they’re working with and you’ll have the same problem.

adalacelove
I'm a developer and:

- I hate ORMs; they are the source of a lot of obscure errors hidden behind layers and layers of abstraction.
- I prefer analytical APIs for technical reasons, not just the language.

Reasons:

- I can compose queries, which in turn makes them easier to decompose
- It's easier to spot errors
- I avoid parsing SQL strings
- It's easier to interact with the rest of the code, both functions and objects

If I just need to make a single query, I gladly write SQL.
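To make the composability argument concrete, here is a minimal sketch using polars (one of the analytical APIs mentioned upthread); the DataFrame and column names are invented purely for illustration:

```python
import polars as pl

# Invented example data; nothing here comes from the thread.
df = pl.DataFrame({
    "region": ["east", "west", "east", "west"],
    "amount": [100, 200, 150, 50],
})

# The predicate and the aggregation are ordinary Python values, so they
# can be named, reused, and tested on their own before being composed.
high_value = pl.col("amount") > 75
total = pl.col("amount").sum().alias("total")

result = (
    df.lazy()              # build the query lazily...
      .filter(high_value)
      .group_by("region")  # recent polars versions; older ones use .groupby()
      .agg(total)
      .collect()           # ...and only execute it here
)
print(result)
```

Doing the same decomposition with a single SQL string usually means string concatenation, which is where the "parsing SQL strings" pain comes from.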

Well, the problem in ORM is the O. Object-orientation is just a worse way to organise your data and logic than relational algebra.

It's just a shame that many languages don't support relational algebra well.

We had relations as a datatype and all the relevant operations over them (like join) in a project I was working on. It was great! Very useful for expressing business logic.
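As a rough idea of what "relations as a datatype" can look like (a sketch only, not the project described above), a relation can be modelled as a list of rows and natural join as an ordinary function over two relations:

```python
# Hypothetical example: a relation is a list of dict rows; natural join
# keeps combined rows that agree on all shared attribute names.
def natural_join(r, s):
    shared = (set(r[0]) & set(s[0])) if r and s else set()
    return [
        {**a, **b}
        for a in r
        for b in s
        if all(a[k] == b[k] for k in shared)
    ]

employees = [{"emp": "ada", "dept": "eng"}, {"emp": "bob", "dept": "sales"}]
floors = [{"dept": "eng", "floor": 3}, {"dept": "sales", "floor": 1}]

print(natural_join(employees, floors))
# [{'emp': 'ada', 'dept': 'eng', 'floor': 3},
#  {'emp': 'bob', 'dept': 'sales', 'floor': 1}]
```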

idiotsecant
'real' engineers can use SQL just fine. This is a strange position to take.
collingreen
No true Scotsman would struggle with sql
atomicnumber3
Aren't data engineers programmers? That is to say, a data engineer is-a software engineer?

I share your sentiment though - I'm a data engineer (8 years) turned product engineer (3 years) and it astounds me how little SQL "normal" programmers know. It honestly changed my opinion on ORMs - it's not like the SQL people would write exceeds the basic select/filter/count patterns, which are the most that non-data people know.

sanderjd
> But software developers often struggle to use sql

Is this true? It doesn't seem true to me.

winnie_ua
Oh, sweet summer child.

Yes, there are so many so-called developers in backend work who do not know how to do basic SQL. Anything bigger than a simple WHERE clause.

I won't even get into using indexes in the database.

sanderjd
I've been doing this for over fifteen years. I haven't ever worked with anyone who is good at the job in general but can't figure out SQL.
nathanfig
Claude made this point while reviewing my blog for me: the mechanization of farms created a whole lot more specialization of roles. The person editing CAD diagrams of next year's combine harvester may not be a farmer strictly speaking, but farming is still where their livelihood comes from.
dredmorbius
Strictly speaking, farming is where all our livelihoods come from, in the greatest part. We're all living off the surplus value of food production.

(Also of other food, energy, and materials sourcing: fishing, forestry, mining, etc.)

This was the insight of the French economist François Quesnay in his Tableau économique, foundation of the Physiocratic school of economics.

lipowitz
Replacing jobs that could only be performed by those living near the particular fields with jobs that can be done anywhere means the work goes to whoever will accept the least satisfactory compensation for the most skill and work.

Working the summer fields was one of the least desirable jobs but still gave local students with no particular skills a good supplemental income appropriate for whichever region.

swader999
Land is scarce, though. The amount of software work that needs doing might not be; it could be effectively infinite, or more likely tied to electrical capacity.
downrightmike
They're all real programmers, John
ameliaquining
Is that really because of the English-esque syntax, rather than because it was a step forward in semantic expressivity? If SQL looked like, say, C#'s LINQ method syntax, would it really be harder to use?
> Is that really because of the English-esque syntax

Well, what we had before SQL[1] was QUEL, which is effectively the same as Alpha[2], except in "English". Given the previous assertion about what came before SQL, clearly not. I expect SQL garnered favour because it is tablational instead of relational, which is the quality that makes it easier to understand for those not heavy in the math.

[1] Originally known as SEQUEL, a fun word play on it claiming to be the QUEL successor.

[2] The godfather language created by Codd himself.

dmkolobov
Do you have any advice for understanding the difference between "relational" and "tablational"? I remember hearing something about how SQL is not really relational from my college professor, but we never really explored that statement.
AdieuToLogic
Before SQL became an industry standard, many programs which required a persistent store used things like ISAM[0], VISAM (a variant of ISAM[0]), or proprietary B-Tree libraries.

None of these had "semantic expressivity" as their strength.

> If SQL looked like, say, C#'s LINQ method syntax, would it really be harder to use?

Yes.

0 - https://en.wikipedia.org/wiki/ISAM
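To make the contrast being debated concrete, here is the same toy query written both ways: once in SQL's English-style keywords and once in a method-chained style, with plain Python builtins standing in for LINQ's Where/OrderBy/Select. The data and query are invented for illustration:

```python
# Invented example data.
employees = [
    {"name": "Ada", "salary": 90_000},
    {"name": "Bob", "salary": 40_000},
    {"name": "Cleo", "salary": 65_000},
]

# English-keyword style (shown as a string, not executed here):
sql_style = """
    SELECT name, salary
    FROM employees
    WHERE salary > 50000
    ORDER BY salary DESC
"""

# Method-chain style, roughly what LINQ's fluent syntax feels like:
method_style = sorted(
    (e for e in employees if e["salary"] > 50_000),
    key=lambda e: e["salary"],
    reverse=True,
)
print(method_style)  # Ada first, then Cleo
```

Both express the same relational operations; the question in the thread is whether the keyword spelling is what made SQL approachable, or the underlying model.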

charlieyu1
Don’t think the person was joking. It was literally the promise of COBOL
Er, have you heard of Datalog or Prolog? Declarative programming really does work. SQL was just... botched.
glimshe
Yes. And I think SQL is actually pretty good for what it does. My point, like the parent's (I suppose), is that we've heard this "XYZ, which uses natural language, will kill software development" before.
sanderjd
I think SQL is better than datalog. I suspect this is one of those opinions that may be somewhat outside consensus on a forum like HN, but is strongly within the consensus more broadly.
aeonik
I like both, they excel at different things.

SQL is pretty horrifying when you start getting tree structures or recursive structures. While Datalog handles those like a champ.

SQL is really nice for columnar data, and it's well supported almost everywhere.

Though Datalog isn't half bad at columnar data either.
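For anyone curious what the recursion complaint looks like in practice, here is a sketch of a transitive-closure query: two short rules in Datalog versus a recursive CTE in SQL, run through Python's sqlite3 here. The table and column names are invented:

```python
import sqlite3

# In Datalog, ancestry over a parent relation is two rules:
#   ancestor(X, Y) :- parent(X, Y).
#   ancestor(X, Z) :- parent(X, Y), ancestor(Y, Z).
# The SQL equivalent needs a recursive common table expression.

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE parent(child TEXT, parent TEXT);
    INSERT INTO parent VALUES ('c', 'b'), ('b', 'a');
""")

rows = conn.execute("""
    WITH RECURSIVE ancestor(child, anc) AS (
        SELECT child, parent FROM parent
        UNION
        SELECT p.child, a.anc
        FROM parent AS p JOIN ancestor AS a ON p.parent = a.child
    )
    SELECT child, anc FROM ancestor
""").fetchall()

print(rows)  # ('c', 'b'), ('b', 'a'), ('c', 'a'), in some order
```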

sanderjd
Yeah. I think you and I have the same opinions about this :)

I like SQL more when working with tabular data (especially for analytical use cases), but that has (thus far) dominated the kinds of work I've done.

kayodelycaon
SQL and many DSLs (JIRA…) are actually used by plenty of non-technical users. Anyone who wants to build their own reports and do basic data analysis has sufficient incentive to learn it.

They are very much the exception that proves the rule though.

dredmorbius
I'd long ago (1990s-era) heard that the original intent was that office secretaries would write their own SQL queries.

(I'd love for someone to substantiate or debunk this for me.)

rsynnott
That's always the promise of these things; non-specialists will be able to program now! This has been going on since COBOL. The one case where it arguably worked out to some extent was spreadsheets.
bluGill
Anyone with complex spreadsheets (which is a lot of companies) has a few programmers with the job of maintaining them. The more training those people have in "proper programming" the better the spreadsheets work.
jimbokun
I would say that failed with SQL but succeeded with Excel. If you replace "office secretaries" with "office workers" in general.
ahmeneeroe-v2
It kinda came true. "Office secretaries" became PMs/junior analysts/etc and those people generally know basic SQL nowadays
gsinclair
I heard that from a lecturer too. It really stuck in the memory.
bazoom42
Early on, programming was considered secretarial work.
AdieuToLogic
> Early on, programming was considered secretarial work.

Incorrect.

Encoding a program was considered secretarial work, not the act of programming itself. Over time, "encoding" was shortened to "coding."

This is why the industry term "coder" is a pejorative descriptor.

> This is why the industry term "coder" is a pejorative descriptor.

For some people some of the time. I don't think that's true in general.

0points
> This is why the industry term "coder" is a pejorative descriptor.

It is not.

brabel
It used to be widely seen as such. See for example Stallman's latest post, where he mentions that. Coder was not the same as programmer; it was the lesser half of the job. Nowadays the term has lost its original meaning.
Or QBE, "Query By Example", which was another attempt by IBM to make a query language directly usable by anyone.
bitpush
Bravo. This is the exact sentiment I have, but you expressed in a way that I could never have.

Most people miss the fact that technical improvements increase the pie in a way that was not possible before.

When digital cameras became popular, everybody became a photographer. That only made the world better, and we got so many more good photographers. Same with YouTube & creativity.

And same with coding & LLMs. The world will have lots more apps, and more programmers.

munificent
> That only made the world better, and we got soo many more good photographers.

I disagree with the "only" part here. Imagine a distribution curve of photos with shitty photos on the left and masterpieces on the right, where the height of the curve is how many photos there are to be seen at that quality.

The digital camera transition massively increased the height of the curve at all points. And thanks to things like better autofocus, better low-light performance, and a radically faster iteration loop, it probably shifted the low and middle ends to the right.

It almost certainly increased the number of breathtaking, life-changing photos out there. Digital cameras are game-changers for photojournalists traveling in difficult locations.

However... the curve is so high now, the sheer volume of tolerably good photos so overwhelming, that I suspect the average person actually sees fewer great photos than they did twenty years ago. We all spend hours scrolling past nice-but-forgettable sunset shots on Instagram and miss out on the amazing stuff.

We are drowning in a sea of "pretty good". It is possible for there to be too much media. Ultimately, we all have a finite amount of attention to spend before we die.

DavidPiper
Thank you for describing this so eloquently.

Meaning no disrespect to photographers, I'm starting to think that a probable outcome of all the AI investment is a sharp uptick in shovelware.

If we can get AIs to build "pretty good" things - or even just "pretty average" things - cheaply, then our app stores, news feeds, ad feeds, company directives, etc, will be continuously swamped with it.

> Meaning no disrespect to photographers, I'm starting to think that a probable outcome of all the AI investment is a sharp uptick in shovelware.

You can use AI to filter out the shovelware, so you never have to see it.

shinedog
You hit this so hard it was impossible not to recognize. In every sense there is too much "ok" shit (in every media realm) that we cannot help but miss amazing stuff. Knowing that I don't have enough time for all the incredible things that technology has enabled crushes me.
jimbokun
All of the old, great classic movies are available for streaming somewhere.

I still find great value in the TCM cable channel. Simply because if I tune in at a random time, it's likely to be showing an excellent old film I either never heard of or never got around to watching.

The service they are offering is curation, which has a lot of value in an age of infinite content flooding our attention constantly.

Anamon
TCM is also my go-to example for why linear TV still has value in the age of streaming. You not only get curation and exposure to stuff you didn't know about, but also knowledgeable people putting it in context and explaining backgrounds to you. Such a wonderful channel.
munificent
...and now think how much worse this problem will become now that we're in the era of generative AI.
kjkjadksj
It affects even the competent photographer. How many times do you see that photographer with all the gear sit in front of a literal statue and fire off a 30-shot burst in 2 seconds? I don’t envy today’s pro photo editors in sports either. I wonder how many shots they have to go through per touchdown from all the photographers at the end zone firing a burst until everyone stands up and throws the ball back to the ref? After a certain point you probably have to just close your eyes and pick one of the shots that looks almost identical to another 400. Not a job for analysis-paralysis people. I guess it sure beats having to wait for the slide film to develop.
bluGill
I suspect most of the time you can eliminate 300 of those 400 right away - they obviously are either too early or too late to capture the moment. On the remaining 100 you can choose any one (or more likely 5 as there are likely several moments - the moment the catch is made and the moment the athlete smiles as he realizes he made that catch).

The reason to take all 400, though, is that every once in a while one photo is obviously better than another for some reason. You also want several angles, because sometimes the light will be wrong at the moment, or someone will happen to be in the way of your shot...

dotancohen
The AI is already picking out the best photo in those 400-shot bursts.

And sometimes it is even combining elements from different photos: Alice had her eyes closed in this otherwise great shot, but in this other shot her eyes were open. A little touch-up and we've got the perfect photo.

kjkjadksj
What AI does this right now?
socalgal2
don't you just let the AI pick? I'm only half joking. I thought that was a feature added to smartphones a year or two ago?
test6554
Experts warn that at current production levels, the supply of dick pics may actually outpace demand in a couple decades.
DavidPiper
I was under the impression that supply already vastly outstrips demand.
Demand is very unevenly distributed. I think they are appreciated on Grindr.
dijksterhuis
> That only made the world better

Did it?

people now stand around on dance floors taking photos and videos of themselves instead of getting on with dancing and enjoying the music. to the point where clubs put stickers on phones to stop people from doing it.

people taking their phone out and videoing / photographing something awful happening, instead of doing something helpful.

people travel to remote areas where the population has been separated from humanity and do stupid things like leave a can of coke there, for view count.

it’s not made things better, it just made things different. whether that’s better or worse depends on your individual perspective for a given example.

so, i disagree. it hasn’t only made things better. it made some things easier. some things better. some things worse. some things harder.

someone always loses, something is always lost. would be good if more people in tech remembered that progress comes at a cost.

skeeter2020
Live music sucks when you're trying to watch the show and some dumb-dumb is holding their phone above their head to shoot the entire show with low-light, bad angle & terrible sound. NO ONE is going to watch that, and you wrecked the experience for many people. Put your phone away and live in the present, please...
thangalin
> people now stand around on dance floors taking photos and videos of themselves instead of getting on dancing and enjoying the music. to the point where clubs put stickers on phones to stop people from doing it.

There are other types of dances where dancers are far more interested in the dance than selfies: Lindy Hop, Blues, Balboa, Tango, Waltz, Jive, Zouk, Contra, and West Coast Swing to name a few. Here are videos from the Blues dance I help organize where none of the dancers are filming themselves:

* https://www.facebook.com/61558260095218/videos/7409340551418...

* https://www.facebook.com/reel/3659488930863692

skeeter2020
Thank you for sharing your social media videos as evidence in a rebuttal to "camera phones are not all good; their ubiquitous use has negative implications too". So delicious...
VonTum
The irony!

Though, I'll grant that there's not really a way to argue this without showing videos

kjkjadksj
That sort of dancing is basically a sport. You have to learn it, you have to get good at it after you’ve learned it, and it is cardio after all. I think OP was talking more about what you see in the EDM scene these days, where basically people aren’t there to dance like the old days or sing along like in other genres; they are there to see a certain DJ, and then they post clips from the entire set on their Instagram story. And they can do this because the dancing at an EDM show is a super passive kind of dancing where you are just swaying a little, so you can hold the phone steady at the same time. If you were dancing like they danced at EDM concerts in the 90s, all rolling on molly, it would be like your blues swing: just too physical to do anything but rave around flinging your arms all over, shirtless and sweaty.
ttoinou
Look into contact improv and ecstatic dance: cellphones are forbidden and you can dance however you like.
flashgordon
I would add one thing though. The pie definitely gets bigger - but I feel there is a period of "downsizing" that happens. I think this is because of a lack of ideas. When you have a tool that (say) 10xes your productivity, it's not that bosses will have ideas to build 10x the number of things - they will just look to cut costs first (hello lack of imagination and high interest rates).
sarchertech
We’ve had many improvements that increased productivity at least as much as current LLMs, and I don’t think any of them ever temporarily caused downsizing in the total number of programmers.
flashgordon
Is it possible that we don't remember them precisely because they are temporary (at least in the grand scheme of things)?
sarchertech
I think the only way that’s hiding in the data is if these downturns happen to always coincide with large economic downturns in general.

Or if they are so temporary that they last less than a year.

I thought photographers don't get paid well anymore due to market saturation and how few skills are required to get a good photo?
kjkjadksj
It is still as hard as it's ever been to get a good photo. They had fully automatic film cameras that could take good photos in the 70s, but the devil is always in the edge cases and the subconscious ability to take an evenly exposed (in the Ansel Adams sense, not the auto-exposure sense), well-composed image at the decisive moment. Understanding how lighting works (either natural light or different artificial light like flash or studio lighting) is also not easy.

It is pretty hard to break out, but people still make names for themselves, either from experience on assignments like in the old days or from Instagram and other social media followings. People still need weddings shot and professional portraits taken, which takes some skill in understanding the logistics of how to actually do that job well and efficiently and how to manage your equipment.

bluGill
As I said in a sibling reply: practice is much easier and so it is much easier to get good. Film was expensive, so few could afford to become good photographers. Sure, everyone had a camera, many of them nice SLRs with decent lenses (but probably not autofocus - for both better and worse), but it wouldn't take a lot of photos to exceed that cost in film.
This implies photographers used to be paid well in the past, which isn't true. Like painting or rock music, photography has always been a winner-takes-all kind of market where a select few can get quite wealthy but the vast majority will be struggling forever.
bluGill
While photography was never a sure path to riches and fame, professionals used to do very good business and make a good living.

Demand is way down because while a $5000 lens on a nice camera is better than my phone lens, my phone is close enough for most purposes. Also my phone is free; in the days of film, a single roll of film by the time you developed it cost significant money (I remember as a kid getting a camera for my birthday and then my parents wouldn't get me film for it - in hindsight I suspect every roll of film cost my dad half an hour of work, and he was a well paid software developer). This cost meant that you couldn't afford to practice taking pictures; every single one had to be perfect. So if you wanted a nice picture of the family it was best to pay a professional who, because of experience and equipment, was likely to take a much better one than you could (and if something went wrong they would retake it for free).

bluefirebrand
> World will have lots more of apps, and programmers.

This is actually bad for existing programmers though?

Do you not see how this devalues your skills?

platevoltage
I see your point, but I'm personally having a different experience.

A client of mine has gotten quite good at using Bolt and Lovable. He has since put me on 3 more projects that he dreamed up and vibe coded, projects that would have remained a figment of his imagination pre-AI.

He knows what's involved in software development, and knows that he can't take it all the way with these tools.

sarchertech
There are far more programmers now than in 1980, yet the average programmer makes far more (inflation adjusted) now.
Funes-
And the quality of what they develop is in the gutter, on average.
jimbokun
It was in 1980, too.
Funes-
Absolutely not, not to the same extent. That's a really illogical statement on your part, considering that the technical barrier to entry to even begin to think about developing a program in 1980 was much, much higher than what it's been for more than a decade now.
kjkjadksj
Thank the Bangalore office for that.
FirmwareBurner
How much online shopping could you do from your PC in 1980? How many people had smartphones in 1980?

That's why SW dev salaries went up like crazy in our time and not in 1980.

But what new tech will we have, that will push the SW dev market demand up like internet connected PCs and smartphones did? All I see is stagnation in the near future, just maintaining or rewriting the existing shit that we have, not expanding into new markets.

Maintaining and rewriting existing shit is quite well paying though, and also something that AI seems to struggle with. (Funnily enough, AI seems to struggle even more with refactoring vibecoded projects than with refactoring human-written apps. What that says about the quality of the vibe coded code I don't know.)
FirmwareBurner
Tech salaries grew because beyond rewriting and maintaining shit, even more new shit was being built from scratch.

What will the job market look like when it's all rewriting and maintaining the existing shit?

bitpush
In the current state, yes. But that is also an opportunity, isn't it?

When online flight bookings came about, travel agents were displaced. The solution isn't "let's stop online flight booking sites and protect travel agents", because that's an inefficient system.

dijksterhuis
Why does every system need to be efficient?
komali2
Under capitalism, because greater margins. Under not-capitalism, so as to free up resources and labor for other things or just increase available downtime for people.
Funes-
>Under capitalism, because greater margins

Under capitalism, or late-stage capitalism if you will, more efficient procedures don't normally allow for greater margins. There are countless examples of more exploitative and wasteful strategies yielding much greater margins than more efficient alternatives.

hackernoops
Fractional reserve lending, rehypothecation, etc.
lupire
Sorry to be that guy, but would you prefer it if your computer and phone each cost $5000?
insane_dreamer
> everybody become a photographer. That only made the world better, and we got soo many more good photographers.

Not sure I agree. I haven't seen much evidence of "better photography" now that it's digital instead of film. There are a million more photos taken, yes, because the cost is zero. But quantity != quality or "better", and if you're an average person, 90% of those photos are in some cloud storage and rarely looked at again.

You could argue that drones have made photography better because it's enabled shots that were impossible or extremely difficult before (like certain wildlife/nature shots).

One thing digital photography did do is decimate the photographer profession, because there is such an abundance of "good enough" photos - why pay someone to take good ones? (This may be a lesson for software development too.)

bluGill
While the vast majority of photos are bad, there are still more great photos mixed in than ever before. You of course won't see them, because even great photos are hidden from view in all the noise, but they are still there.
20after4
And now the wedding / portrait photography business has become hyper-competitive. Now everyone's cousin is an amateur photographer and every phone has an almost acceptable camera built in. It is much more difficult to have a profitable photography business compared to 20 years ago.
bachmeier
That's good to hear. Back when I got married there were some real jerks in the wedding photography business, and they weren't worried about running out of customers. Here's an actual conversation I had with one of them:

Me: "I'm getting married on [date] and I'm looking for a photographer."

Them, in the voice of Nick Burns: "We're already filling up for next year. Good luck finding a photographer this year."

Me: "I just got engaged. You never have anything open up?"

Them: "No" and hang up the phone.

The faster guys like that struggle to make a living, the better.

tonyedgecombe
I know a couple of professional photographers and neither of them will do weddings. It seems many of the clients are as bad as the photographers.
LargeWu
In the same breath, those photographers will complain about all the "amateurs" devaluing their services.
NewsaHackO
Definitely. What matters more is that the ability to take photos is available to more people, which is a net positive.
The game has definitely changed. It used to be profitable to be a photographer for hire, and that’s no longer the case. But the revenue generated through pictures (by influencers) has increased a lot.

If today all you do as a programmer is open jira tickets without any kind of other human interaction, AI coding agents are bad news. If you’re just using code as a means to build products for people, it might be the best thing that has happened in a long time.

jimbokun
> But the revenue generated through pictures (by influencers) has increased a lot.

So the job qualifications went from "understand lighting, composition, camera technology" to "be hot".

deanCommie
> That only made the world better, and we got soo many more good photographers. Same with YouTube & creativity.

I think you really missed the point of what these technologies and innovations actually did for society and how it applies to today, underneath the snark.

In the 1970's, if you got gifted a camera, and were willing to put in the work to figure out how to use it, you learned a skill that immediately put you in rare company.

With enough practice of that skill you could be a professional photographer, which was a good, reliable, well-paid job. Now, the barrier to entry is nothing, so it's extremely competitive to be a professional photographer, and even the ones that succeed just scrape by. And you have to stand out on things other than the technical ability to operate a camera.

That's...what's about to happen (if it hasn't already) with software developers.

bluGill
> In the 1970's, if you got gifted a camera, and were willing to put in the work to figure out how to use it, you learned a skill that immediately put you in rare company.

Everyone in the 1970s was gifted a camera. Many of them got a nice SLR with a better lens than a modern smartphone. Cameras were expensive, but within reach of most people.

Film was a different story. Today you can get 35mm film rolls for about $8 (36 pictures), plus $13 to develop (plus shipping!) and $10 for prints (in 1970 you needed prints for most purposes, though slides were an option), so $31 - where I live McDonald's starts you at $16/hour, so that roll of film costs almost 2 hours' work - before taxes.

Which is to say you couldn't afford to become skilled in 1970 unless you were rich.

iamflimflam1
I think what we forget is these high level languages did open up programming to people who would have been considered “nontechnical” back in the day.
This is true, but:

- there are more programmers today than there were back then
- the best programmers are still those who would be considered technical back then
RickJWagner
Very true.

But consider this: back in the day, how many mainframe devs (plus the all-important systems programmer!) would it take to conjure up a CRUD application?

Did you forget the VSAM SME or the DBA? The CICS programming?

Today, one person can do that in a jiffy. Much, much less manpower.

That might be what AI does.

ashoeafoot
We now have assembler; now anyone can program.

No, wait, it was called natural-language coding; now anyone can code.

No, wait, it was called run-anything self-fixing code. No, wait, simplified domain-specific languages.

No, wait, it was UML-based coding.

No, wait, Excel macros.

No, wait, it's node-based drag and drop.

No, wait, it's LLMs.

The no-code delusion is strong with the deciding caste; every reincarnation must be taxed.

The big difference with LLMs is that you don't have to have a coherent and logical thought, and the LLM will "fix" that for you by morphing it into the nearest coherent expression and showing you the result.

Presumably, the LLM user will have sufficient brain capacity to verify that the result works as they have imagined (however incomplete the mental picture might be). They then have an opportunity to tweak, in real time (of sorts), to make the output closer to what they want. Repeat this as many times as needed/time available, and the output gets to be quite sufficient for purpose.

This is how traditional, bespoke software development would've worked with contractor developers. Except with LLM, the turnaround time is in minutes, rather than in days or weeks.

soulofmischief
What's wrong with visual programming?
ashoeafoot
Information density is low, some concepts are hard to display and are thus absent by design from the language, and the limitations of the display and debug framework become the limitations of all code executed with them... etc., etc.; the list goes on forever.
soulofmischief
It's a structured way of programming that can invoke particular mental models when solving certain problems. You could for example see it as a scripting language for high-level tasks.

For example, look at what Blender and Unreal Engine do with visual programming. Or you could see how Max MSP or Processing make music and data manipulation intuitive. And you can still address lower-level or complex concerns with custom nodes and node groups.

Done correctly, you can hide a lot of complexity while having an appropriate information density for the current level of abstraction you're dealing with. Programming environments don't need to be one-size fits all! :)

platevoltage
Fast forward a couple decades and "Ok here it is. We call it Dreamweaver"
mrheosuper
Pretty sure I use English in C programs.
fuzztester
that was from 35 to 40 years ago.

today:

s/COBOL/SQL

and the statement is still true, except that many devs nowadays are JS-only, and are too scared or lazy as shit to learn another, relatively simple language like SQL. ("it's too much work". wtf do you think a job is. it's another name for work.)

because, you know, "we have to ship yesterday" (which funnily enough, is always true, like "tomorrow never comes").

SQL is straightforward enough, but it's not the sketchy part. Taking down the database so other people can't use it by running a test query is the bad part.

The explains are not nearly as straightforward to read, and the process of writing SQL is to write the explain yourself and then try to coax the database into turning the SQL you write into that explain. It's a much less pleasant LLM chat experience.

tempodox
But a sycophant with a hopeless case of Dunning-Kruger (a.k.a. LLM) is so much more entertaining!
