Yep, because that's all AI is: a parrot. For well-defined problems it works great... BECAUSE THEIR WELL DEFINED.

Generative AI is only as good as the dataset you give it, so for problems that exist in a heavily contrived and parameterized space (like leetcode-style problems) it works really well. But give it a novel problem with intertwined libraries and external dependencies, plus custom type and structure definitions, and it's going to fall flat.

Humans do something AI can't: draw on experience to apply a solution to a novel problem. This is why I'm not terribly worried about AI coming for engineering jobs anywhere in the near future. I use ChatGPT all the time to write me small functions, generate regular expressions, etc. Basically all the drudgery.
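
To be concrete, here's the kind of drudgery I mean. This is a toy sketch I wrote for illustration (the pattern and names are made up), not actual ChatGPT output:

    import re

    # Toy example of the kind of function I'd normally ask ChatGPT for:
    # pull ISO-8601 dates (YYYY-MM-DD) out of free text.
    ISO_DATE = re.compile(r"\b(\d{4})-(\d{2})-(\d{2})\b")

    def extract_dates(text: str) -> list[str]:
        """Return all ISO-8601 dates found in text."""
        return [m.group(0) for m in ISO_DATE.finditer(text)]

    print(extract_dates("Released 2023-03-14, patched 2023-04-01."))
    # ['2023-03-14', '2023-04-01']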

I'd argue we've hit peak AI at this point, because from here on out, all datasets are going to be colored by AI-generated results. Generative AI is now on a trajectory where it'll simply regress to the mean of knowledge.


How can you make a spelling mistake like that in ALL CAPS and not see it? ;)

I don't know about the rest of your comment, though I consider myself more in the non-Chomsky camp. This emergence thing seems like little more than an elaborate hoax to me.

Well, maybe they've just trained GPT-4 to wiggle my balls when I ask it to analyse a poem that I wrote 15 years ago.

Regardless of this debate, the local LLM scene is IMHO the hottest topic atm.

Several finetunes have repeatedly blown my mind even in the past 2 weeks. On a Raspberry Pi.
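
For anyone curious what "local" means in practice, here's a minimal sketch assuming llama-cpp-python and a small 4-bit-quantized GGUF model (the model file name is a placeholder for whichever finetune you grab):

    from llama_cpp import Llama  # pip install llama-cpp-python

    # Placeholder model path: any small quantized GGUF finetune works on a
    # Pi, just slowly; 4-bit quantization keeps memory in the 1-4 GB range.
    llm = Llama(model_path="./tinyllama-1.1b-chat.Q4_K_M.gguf", n_ctx=512)

    out = llm("Q: What is a Raspberry Pi? A:", max_tokens=64)
    print(out["choices"][0]["text"])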

AI can interface: Natural Language Processing. This is the first time in history humanity has had such knowledge and technology.

It's basically C-3PO.

You make it sound as if (software) engineering jobs constantly involve cutting-edge innovation, and perhaps as if all engineers are capable of it. That doesn't match my experience. Most of it is a more or less smart combination of standard concepts: something GPTs are already pretty good at, and I'm sure they will improve further.

But we are, at least I am, almost always integrating or interacting with new libraries, APIs, new features, new behaviours. I might not be personally writing that code, but I definitely need to be up to date and adapting all the time.

I came to the realization that I'm using GPT more and more for my boilerplate code, and I'm starting to feel a bit guilty about it, tbh.

Maybe I shouldn't, though I justify it because it allows me to focus on the bespoke bits of whatever I'm doing...

Feels a bit double-edged to me still.

I never write boilerplate code, am I weird?

Even when I do need some, I use Swagger or copier templates or something similar.
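
For example, a sketch assuming copier's Python API (the template URL and answers here are placeholders):

    from copier import run_copy  # pip install copier

    # Placeholder template and answers; any copier template works the same
    # way, and the CLI equivalent is `copier copy <template> <dest>`.
    run_copy(
        "https://github.com/copier-org/autopretty.git",  # placeholder template
        "my-new-project",                                # destination directory
        data={"project_name": "my-new-project"},         # pre-filled answers
    )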

If you’re using ChatGPT for scaffolding, I feel you’ve fucked up?

I'm not a developer, mind you, but DevOps. And while I reuse code regularly as well, when new projects come up, I'll ramp up with said boilerplate. Though that's typically only required if it's something new.

If you think we've hit peak AI, you're grossly underestimating the sheer volume of copyrighted books, manuscripts, screenplays, podcasts, movies, documents, history, and research papers that ChatGPT hasn't been trained on. There's a LOT more juice to squeeze still.

This is actually incorrect; there's not that much data left to train on. I remember reading an article about it, might have been one of Gwern's or something about Chinchilla scaling, but to produce an order-of-magnitude increase we need an order of magnitude more data, and there just isn't that amount available.
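The back-of-envelope version of that argument, as I remember it (the ~20 tokens-per-parameter ratio is the Chinchilla result; the data-supply figure in the comment below is a rough assumption):

    # Chinchilla-style compute-optimal training wants roughly 20 training
    # tokens per model parameter (Hoffmann et al., 2022).
    TOKENS_PER_PARAM = 20

    for params in (70e9, 700e9, 7e12):  # 70B -> 700B -> 7T parameters
        tokens_needed = params * TOKENS_PER_PARAM
        print(f"{params / 1e9:>6.0f}B params -> {tokens_needed / 1e12:5.1f}T tokens")

    # Rough assumption: usable high-quality public text is commonly
    # estimated at tens of trillions of tokens, so each 10x jump in model
    # size demands 10x more data, and the supply runs out quickly.
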
I wonder if Regenerative AI would be a more suitable name.
