"Before 1954, almost all programming was done in machine language or assembly language. Programmers rightly regarded their work as a complex, creative art that required human inventiveness to produce an efficient program."
-John Backus, "The History of Fortran I, II, and III", https://dl.acm.org/doi/10.1145/800025.1198345
"The IBM Mathematical Formula Translating System or briefly, FORTRAN, will comprise a large set of programs to enable the IBM 704 to accept a concise formulation of a problem in terms of a mathematical notation and to produce automatically a high speed 704 program for the solution of the problem."
-IBM, "Specifications for the IBM Mathematical FORmula TRANslating System, FORTRAN", http://archive.computerhistory.org/resources/text/Fortran/10...
"FORTRAN should virtually eliminate coding and debugging" https://www.hackerneue.com/item?id=3970011
But it has still been immensely useful and a durable paradigm, even though its usage hasn't turned out exactly as envisioned.
Maybe a similar bifurcation will arise where there are vibe coders who use LLMs to write everything, and there are real engineers who avoid LLMs.
Maybe we’re seeing the beginning of that with the whole bifurcation of programmers into two camps: heavy AI users and AI skeptics.
I'm more surprised by software engineers who do know these things than by the ones who don't.
It’s not that SQL is hard, it’s that for any discipline the vast majority of people don’t have a solid grasp of the tools they’re using. Ask most tradespeople about the underlying thing they’re working with and you’ll have the same problem.
Reasons:
- I can compose queries, which in turn makes them easier to decompose
- It's easier to spot errors
- I avoid parsing SQL strings
- It's easier to interact with the rest of the code, both functions and objects
If I only need to make a single query, I gladly write SQL.
It's just a shame that many languages don't support relational algebra well.
We had relations as a datatype and all the relevant operations over them (like join) in a project I was working on. It was great! Very useful for expressing business logic.
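For anyone curious what "relations as a datatype" can look like in an ordinary language, here is a minimal, hypothetical Python sketch (not the actual project described above - all names are invented): a relation is just a collection of rows, and restrict/project/join are plain functions, so queries compose like any other code.

    # Minimal sketch: a relation as a plain value, with relational
    # operators as ordinary functions. Illustrative only.
    from typing import Callable

    Row = dict        # one record: column name -> value
    Relation = list   # a bag of rows (a real implementation would enforce set semantics)

    def restrict(r: Relation, pred: Callable[[Row], bool]) -> Relation:
        """Relational selection (the WHERE clause)."""
        return [row for row in r if pred(row)]

    def project(r: Relation, cols) -> Relation:
        """Relational projection (the SELECT list)."""
        return [{c: row[c] for c in cols} for row in r]

    def natural_join(left: Relation, right: Relation) -> Relation:
        """Natural join on whatever columns the two relations share."""
        if not left or not right:
            return []
        shared = set(left[0]) & set(right[0])
        return [
            {**l, **r}
            for l in left
            for r in right
            if all(l[c] == r[c] for c in shared)
        ]

    # Invented data, purely for illustration.
    customers = [{"customer_id": 1, "name": "Ada"}, {"customer_id": 2, "name": "Grace"}]
    orders = [{"order_id": 10, "customer_id": 1, "total": 250}]

    big_spenders = project(
        restrict(natural_join(customers, orders), lambda row: row["total"] > 100),
        ["name", "total"],
    )
    print(big_spenders)  # [{'name': 'Ada', 'total': 250}]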
I share your sentiment though - I'm a data engineer (8 years) turned product engineer (3 years) and it astounds me how little SQL "normal" programmers know. It honestly changed my opinion on ORMs - it's not like the SQL that people would write exceeds the basic select/filter/count patterns, which are the most that non-data people know.
Is this true? It doesn't seem true to me.
Yes, there are so many so-called developers in backend work who do not know how to do basic SQL - anything bigger than a simple WHERE clause.
I won't even get into using indexes in a database.
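For what it's worth, "bigger than a simple WHERE clause" usually just means a join, an aggregate, and an index chosen to support them. A hedged sketch below, with invented table names, using Python's built-in sqlite3 only so it runs without any setup:

    # Hypothetical example of a query beyond a bare WHERE clause:
    # join + GROUP BY + HAVING, plus an index that supports it.
    import sqlite3

    db = sqlite3.connect(":memory:")
    db.executescript("""
        CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
        CREATE TABLE orders (
            id INTEGER PRIMARY KEY,
            customer_id INTEGER REFERENCES customers(id),
            total REAL,
            created_at TEXT
        );
        -- Index on the join/filter columns, intended to let the query
        -- below avoid a full scan of the orders table.
        CREATE INDEX idx_orders_customer_created ON orders (customer_id, created_at);

        INSERT INTO customers VALUES (1, 'Ada'), (2, 'Grace');
        INSERT INTO orders VALUES
            (10, 1, 250.0, '2024-01-05'),
            (11, 1,  40.0, '2024-02-01'),
            (12, 2, 300.0, '2024-02-10');
    """)

    rows = db.execute("""
        SELECT c.name, COUNT(*) AS order_count, SUM(o.total) AS lifetime_total
        FROM customers AS c
        JOIN orders AS o ON o.customer_id = c.id
        WHERE o.created_at >= '2024-01-01'
        GROUP BY c.id
        HAVING SUM(o.total) > 100
        ORDER BY lifetime_total DESC
    """).fetchall()

    for name, order_count, lifetime_total in rows:
        print(name, order_count, lifetime_total)
    # Grace 1 300.0
    # Ada 2 290.0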
(Also of other food, energy, and materials sourcing: fishing, forestry, mining, etc.)
This was the insight of the French economist François Quesnay in his Tableau économique, foundation of the Physiocratic school of economics.
Working the summer fields was one of the least desirable jobs, but it still gave local students with no particular skills a good supplemental income for their region.
Well, what we had before SQL[1] was QUEL, which is effectively the same as Alpha[2], except in "English". Given the previous assertion about what came before SQL, clearly not. I expect SQL garnered favour because it is tablational instead of relational, which is the quality that makes it easier to understand for those not heavy in the math.
[1] Originally known as SEQUEL, a fun word play on it claiming to be the QUEL successor.
[2] The godfather language created by Codd himself.
None of these had "semantic expressivity" as their strength.
> If SQL looked like, say, C#'s LINQ method syntax, would it really be harder to use?
Yes.
SQL is pretty horrifying when you start getting into tree structures or recursive structures, while Datalog handles those like a champ.
SQL is really nice for columnar data, and it's well supported almost everywhere.
Though Datalog isn't half bad at columnar data either.
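To make the recursion point concrete, here's a hedged sketch of walking a tree in SQL with a recursive CTE (invented employee/manager table, Python's built-in sqlite3 just so it runs), with the roughly equivalent Datalog shown in a comment for contrast:

    # Hypothetical sketch: traversing a hierarchy in SQL needs WITH RECURSIVE,
    # while in Datalog the same idea is roughly two rules over a
    # manager(Employee, Manager) fact:
    #   reports_to(E, M) :- manager(E, M).
    #   reports_to(E, M) :- manager(E, X), reports_to(X, M).
    import sqlite3

    db = sqlite3.connect(":memory:")
    db.executescript("""
        CREATE TABLE employees (id INTEGER PRIMARY KEY, name TEXT, manager_id INTEGER);
        INSERT INTO employees VALUES
            (1, 'Root',  NULL),
            (2, 'Lead',  1),
            (3, 'Dev A', 2),
            (4, 'Dev B', 2);
    """)

    # Everyone in the reporting tree under the root, with their depth.
    rows = db.execute("""
        WITH RECURSIVE subordinates(id, name, depth) AS (
            SELECT id, name, 0 FROM employees WHERE manager_id IS NULL
            UNION ALL
            SELECT e.id, e.name, s.depth + 1
            FROM employees AS e
            JOIN subordinates AS s ON e.manager_id = s.id
        )
        SELECT name, depth FROM subordinates ORDER BY depth, name
    """).fetchall()

    for name, depth in rows:
        print("  " * depth + name)
    # Root
    #   Lead
    #     Dev A
    #     Dev B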
They are very much the exception that proves the rule though.
(I'd love for someone to substantiate or debunk this for me.)
Incorrect.
Encoding a program was considered secretarial work, not the act of programming itself. Over time, "encoding" was shortened to "coding."
This is why the industry term "coder" is a pejorative descriptor.
For some people some of the time. I don't think that's true in general.
Most people miss the fact that technical improvements increase the pie in ways that were not possible before.
When digital cameras became popular, everybody became a photographer. That only made the world better, and we got so many more good photographers. Same with YouTube and creativity.
And the same will happen with coding and LLMs: the world will have lots more apps, and more programmers.
I disagree with the "only" part here. Imagine a distribution curve of photos, with shitty photos on the left and masterpieces on the right, where the height of the curve is how many photos there are to be seen at that quality.
The digital camera transition massively increased the height of the curve at all points. And thanks to things like better autofocus, better low-light performance, and a radically faster iteration loop, it probably shifted the low and middle ends to the right.
It almost certainly increased the number of breathtaking, life-changing photos out there. Digital cameras are game-changers for photojournalists traveling in difficult locations.
However... the curve is so high now, the sheer volume of tolerably good photos so overwhelming, that I suspect the average person actually sees fewer great photos than they did twenty years ago. We all spend hours scrolling past nice-but-forgettable sunset shots on Instagram and miss out on the amazing stuff.
We are drowning in a sea of "pretty good". It is possible for there to be too much media. Ultimately, we all have a finite amount of attention to spend before we die.
Meaning no disrespect to photographers, I'm starting to think that a probable outcome of all the AI investment is a sharp uptick in shovelware.
If we can get AIs to build "pretty good" things - or even just "pretty average" things - cheaply, then our app stores, news feeds, ad feeds, company directives, etc, will be continuously swamped with it.
I still find great value in the TCM cable channel. Simply because if I tune in at a random time, it's likely to be showing an excellent old film I either never heard of or never got around to watching.
The service they are offering is curation, which has a lot of value in an age of infinite content flooding our attention constantly.
The reason to take all 400, though, is that every once in a while one photo is obviously better than another for some reason. You also want several angles because sometimes the light will be wrong at the moment, or someone will happen to be in the way of your shot...
And sometimes it is even combining elements from different photos: Alice had her eyes closed in this otherwise great shot, but in this other shot her eyes were open. A little touch-up and we've got the perfect photo.
Did it?
people now stand around on dance floors taking photos and videos of themselves instead of getting on with dancing and enjoying the music. to the point where clubs put stickers on phones to stop people from doing it.
people taking their phone out and videoing / photographing something awful happening, instead of doing something helpful.
people travel to remote areas where the population has been isolated from the rest of humanity and do stupid things like leave a can of coke there, for view count.
it’s not made things better, it just made things different. whether that’s better or worse depends on your individual perspective for a given example.
so, i disagree. it hasn’t only made things better. it made some things easier. some things better. some things worse. some things harder.
someone always loses, something is always lost. would be good if more people in tech remembered that progress comes at a cost.
There are other types of dances where dancers are far more interested in the dance than selfies: Lindy Hop, Blues, Balboa, Tango, Waltz, Jive, Zouk, Contra, and West Coast Swing to name a few. Here are videos from the Blues dance I help organize where none of the dancers are filming themselves:
* https://www.facebook.com/61558260095218/videos/7409340551418...
It is pretty hard to break out, but people still make names for themselves, either from experience on assignments like in the old days or from Instagram and other social media followings. People still need weddings shot and professional portraits taken, which takes some skill in understanding the logistics of how to actually do that job well and efficiently, and in managing your equipment.
Demand is way down because, while a $5000 lens on a nice camera is better than my phone lens, my phone is close enough for most purposes. Also, my phone is free; in the days of film, a single roll, by the time you developed it, cost significant money (I remember as a kid getting a camera for my birthday and then my parents wouldn't get me film for it - in hindsight I suspect every roll of film cost my dad half an hour of work, and he was a well-paid software developer). This cost meant that you couldn't afford to practice taking pictures; every single one had to be perfect. So if you wanted a nice picture of the family it was best to pay a professional who, because of experience and equipment, was likely to take a much better one than you could (and if something went wrong they would retake for free).
This is actually bad for existing programmers though?
Do you not see how this devalues your skills?
A client of mine has gotten quite good at using Bolt and Lovable. He has since put me on 3 more projects that he dreamed up and vibe coded, projects that would have remained a figment of his imagination pre-AI.
He knows what's involved in software development, and knows that he can't take it all the way with these tools.
That's why software devs' salaries went up like crazy in our time and not in 1980.
But what new tech will we have that will push SW dev market demand up like internet-connected PCs and smartphones did? All I see is stagnation in the near future: just maintaining or rewriting the existing shit that we have, not expanding into new markets.
When online flight bookings came about, travel agents were displaced. The solution isn't "let's stop online flight booking sites and protect travel agents", because that's an inefficient system.
Under capitalism, or late-stage capitalism if you will, more efficient procedures don't normally translate into greater margins. There are countless examples of more exploitative and wasteful strategies yielding much greater margins than more efficient alternatives.
Not sure I agree. I haven't seen much evidence of "better photography" now that it's digital instead of film. There are a million more photos taken, yes, because the cost is zero. But quantity != quality or "better", and if you're an average person, 90% of those photos are in some cloud storage and rarely looked at again.
You could argue that drones have made photography better because it's enabled shots that were impossible or extremely difficult before (like certain wildlife/nature shots).
One thing digital photography did do is decimate the photographer profession because there is so much abundance of "good enough" photos - why pay someone to take good ones? (This may be a lesson for software development too.)
Me: "I'm getting married on [date] and I'm looking for a photographer."
Them, in the voice of Nick Burns: "We're already filling up for next year. Good luck finding a photographer this year."
Me: "I just got engaged. You never have anything open up?"
Them: "No" and hang up the phone.
The faster guys like that struggle to make a living, the better.
If today all you do as a programmer is open jira tickets without any kind of other human interaction, AI coding agents are bad news. If you’re just using code as a means to build products for people, it might be the best thing that has happened in a long time.
I think you really missed the point of what these technologies and innovations actually did for society and how it applies to today, underneath the snark.
In the 1970's, if you got gifted a camera, and were willing to put in the work to figure out how to use it, you learned a skill that immediately put you in rare company.
With enough practice of that skill you could be a professional photographer, which would be a good, reliable, well-paid job. Now the barrier to entry is nothing, so it's extremely competitive to be a professional photographer, and even the ones that succeed just scrape by. And you have to stand out on things other than the technical ability to operate a camera.
That's...what's about to happen (if it hasn't already) with software developers.
Everyone in the 1970s was gifted a camera. Many of them got a nice SLR with a better lens than a modern smartphone. Cameras were expensive, but within reach of most people.
Film was a different story. Today you can get a 35mm film roll for about $8 (36 pictures), plus $13 to develop it (plus shipping!) and $10 for prints (in 1970 you needed prints for most purposes, though slides were an option), so $31 - where I live, McDonald's starts you at $16/hour, so that roll of film costs almost 2 hours of work - before taxes.
Which is to say you couldn't afford to become skilled in 1970 unless you were rich.
But consider this: back in the day, how many mainframe devs (plus the all-important systems programmer!) would it take to conjure up a CRUD application?
Did you forget the VSAM SME or the DBA? The CICS programming?
Today, one person can do that in a jiffy. Much, much less manpower.
That might be what AI does.
No, wait it was called natural language coding, now anyone can code.
No, wait it was called run anything self fixing code. No wait, simplified domain specific language.
No, wait it was uml based coding.
No, wait excel makros.
No, wait it's node-based drag and drop.
No, wait it's LLMs.
The mental retardation of no-code is strong with the deciding caste; every reincarnation must be taxed.
Presumably, the LLM user will have sufficient brain capacity to verify that the result works as they have imagined (however incomplete the mental picture might be). They then have an opportunity to tweak, in real time (of sorts), to make the output closer to what they want. Repeat this as many times as needed/time available, and the output gets to be quite sufficient for purpose.
This is how traditional, bespoke software development would've worked with contractor developers. Except with LLM, the turnaround time is in minutes, rather than in days or weeks.
For example, look at what Blender and Unreal Engine do with visual programming. Or you could see how Max MSP or Processing make music and data manipulation intuitive. And you can still address lower-level or complex concerns with custom nodes and node groups.
Done correctly, you can hide a lot of complexity while having an appropriate information density for the current level of abstraction you're dealing with. Programming environments don't need to be one-size fits all! :)
today:
s/COBOL/SQL
and the statement is still true, except that many devs nowadays are JS-only, and are too scared or lazy as shit to learn another, relatively simple language like SQL. ("it's too much work". wtf do you think a job is. it's another name for work.)
because, you know, "we have to ship yesterday" (which funnily enough, is always true, like "tomorrow never comes").
the explains are not nearly as straightforward to read, and the process of writing SQL is to first write the explain yourself, and then try to coax the database into turning the SQL you write into that explain. it's a much less pleasant LLM chat experience
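For anyone who hasn't done that loop, it looks roughly like the sketch below (invented table; SQLite's EXPLAIN QUERY PLAN via Python's sqlite3 - Postgres, MySQL, etc. each have their own EXPLAIN with quite different output): write the query, ask the engine for its plan, then adjust the query or the indexes until the plan is the one you had in your head.

    # Hypothetical illustration of the write-query / read-plan / adjust loop.
    # Output format varies a lot between engines and versions; the point is
    # only that you read the plan back and check it matches your intent.
    import sqlite3

    db = sqlite3.connect(":memory:")
    db.executescript("""
        CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
        CREATE INDEX idx_orders_customer ON orders (customer_id);
    """)

    query = "SELECT total FROM orders WHERE customer_id = ?"

    # Whether the plan says something like "SEARCH ... USING INDEX" or
    # "SCAN orders" tells you whether the index you expected is being used.
    for row in db.execute("EXPLAIN QUERY PLAN " + query, (42,)):
        print(row)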
"Wow, show it to me!"
"OK here it is. We call it COBOL."