So: 1) a public service, 2) with no authentication, 3) and no encryption (HTTP only??), 4) that sent a token with every single response, 5) granting full admin access to every client's legal documents. This is like a law firm with an open back door, an open back window, and all the confidential legal papers spread out on the floor.

Imagine the potential impact. You're a single mother, fighting for custody of your kids. Your lawyer has documentation of something that happened to you, something that wasn't your fault but would look bad if brought up in court. Suddenly you receive a phone call: a mysterious voice demands $10,000 or they will send the documents to the opposition. The caller doesn't know you and you don't know them; someone just found a trove of documents behind an open back door and wanted to make a quick buck.

This is exactly what a software building code would address (if we had one!). Just like you can't open a storefront in a new building without it being inspected, you should not be able to process millions of sensitive files without having your software inspected. The safety and privacy of all of us shouldn't be optional.


But Google told me everyone can vibe code apps now and software engineers' days are numbered... it's almost as if there's more stuff we do than just write code...
Humans were stuffing open S3 buckets with text files of usernames, passwords, addresses, credit card numbers, etc. long before vibe coding was a thing.
And those humans would be looking for a new job or face other consequences. An AI model can merrily do this with zero consequences because no meaningful consequences can be visited upon it.

Just like if any human employee publicly sexually harassed his female CEO, he'd be out of a job and would find it very hard to find a new one. But Grok can do it and it's the CEO who ends up quitting.

Prediction: Vibe coding systems will be better at security in 2 years than 90% of devs.
Prediction: it won't.

You can't fit every security consideration into the context window.

90% of human devs are not aware of every security consideration.
You may want to read about agentic AI; you can, for instance, call an LLM multiple times with a different security consideration each time.
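A rough sketch of that multi-pass idea, assuming a hypothetical ask_llm() wrapper around whatever chat API you use (the checklist here is illustrative, not exhaustive):

    from typing import Callable

    # Illustrative checklist; in practice you'd pull these from a real standard
    # (e.g. the OWASP Top 10) and tailor them to the codebase.
    SECURITY_CHECKLIST = [
        "Does every endpoint require authentication and authorization?",
        "Is any secret, token, or credential included in a response body?",
        "Is all traffic forced over HTTPS (no plain-HTTP endpoints)?",
        "Are user inputs validated and database queries parameterized?",
    ]

    def review_diff(diff: str, ask_llm: Callable[[str], str]) -> list[str]:
        """One focused LLM call per concern, instead of one giant prompt."""
        findings = []
        for concern in SECURITY_CHECKLIST:
            prompt = (
                "Review this code change for exactly one security concern.\n"
                f"Concern: {concern}\n\nDiff:\n{diff}\n\n"
                "Reply PASS, or describe the problem."
            )
            answer = ask_llm(prompt)
            if not answer.strip().startswith("PASS"):
                findings.append(f"{concern} -> {answer.strip()}")
        return findings

Each pass only has to hold one concern plus the diff in context, which is the whole point of splitting the review up.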
This will age badly
That’s why we make concrete measurable predictions.
Agreed, but "vibe coding will be better at security" is not one of them. Better by which metric, against which threat model, with which stakes? What security even means for greenfield projects is inherently different than for hardened systems. Vibe coding is sufficient for security today because it's not used for anything that matters.
It'll play a role both in securing systems and in security research, I'm sure, but I'm not confident it'll be better.

But also, you'd need to have some metrics - how good are developers at security already? What if the bar is on the floor and LLM code generators are already better?

Only if they work in a fundamentally different manner. We can't solve that problem the way we are building LLMs now.
AFAICT Filevine doesn't use AI programming: https://www.filevine.com/jobs/d64dfff5-e36f-4db6-adac-0fc082... There's no mention of AI there other than writing code to integrate AI pipelines.

I've seen a lot of job ads lately (Canva, for instance) that mandate AI use or AI experience, and as an AI company, if Filevine wanted that I think they would have put it in the ad.

For the record I think I may be fine with the insincerity of selling AI but not using it!

@grok all software engineers do is mindlessly turn specifications into code in one shot, right?!?
See also: HN told me that regulation is bad and this is why the EU is behind!
> it's almost as if there's more stuff we do than just write code..

Yes, but adding these common sense considerations is actually something LLMs can already do reasonably well.

In 90% of cases. And if you don't know how to spot the other 10%, you are still screwed, because someone else will find it (and you don't even need to be an elite black hat to find it).
What’s to say a human would catch this 10% either?
The salary you pay them, typically
Humans are pretty good at edge cases.
If you explicitly request it, which means you need to know about it.
OpenAI can put that in the system prompt for their CTO-as-a-service once, and then forget about it.
Or you need to guess that it exists, or you need to scan for places it exists.
Clearly not
Basically what happened in the Vastaamo case in Finland [1]. Except of course it wasn't individual phone calls – it was mass extortion of 30,000 people at once via email.

[1] https://en.wikipedia.org/wiki/Vastaamo_data_breach

If I remember correctly, the attacker got caught in such a silly way.

He wanted to demonstrate that he indeed had the private data, but he fucked up the tar command and the archive ended up containing his username in the directory names, a username he used in other places on the internet.
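Roughly this kind of slip, illustrated with Python's tarfile (the username and paths here are made up): archiving a directory by path stores the whole prefix, username and all, in every member name.

    import pathlib, tarfile, tempfile

    tmp = pathlib.Path(tempfile.mkdtemp())
    dump = tmp / "home" / "ransom_guy" / "dump"   # hypothetical username
    dump.mkdir(parents=True)
    (dump / "patients.csv").write_text("name,ssn\n...")

    archive = tmp / "leak.tar"
    with tarfile.open(archive, "w") as tar:
        # Same effect as running `tar cf leak.tar home/ransom_guy/dump`
        # without --transform or -C: the directory prefix ends up in the archive.
        tar.add(dump, arcname=dump.relative_to(tmp).as_posix())

    with tarfile.open(archive) as tar:
        print(tar.getnames())
        # e.g. ['home/ransom_guy/dump', 'home/ransom_guy/dump/patients.csv']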

HTTP-only also makes it very easy for law enforcement to sniff the traffic if they decide to, which lets them gain knowledge about cases. They could be scanning it with their own AI tool for all we know. In a free country with proper law enforcement, this would neither be legal nor happening, but I am not sure the USA remains one, given that the leader is a convicted felon with very dubious moral standards.

The problem here, however, is that they get away with their sloppiness as long as the security researcher who found this is a white hat and the regular news doesn't pick it up. Once regular media pick the story up (and the local outlets should), their name is tarnished and they may come to regret their sloppiness, which is a good way to ensure they won't make the same mistake again. After all, money talks.

All the big tech companies are in the news every week. Everybody knows how bad they are. Their names are tarnished, and yet everyone is still using their junk and they face zero repercussions when fucking up. I don't think media coverage would do them any harm.
In the news, sure. But negatively? I consider myself included in 'everyone', and I am not using junk from all the big tech companies. More than once, I've successfully quit using certain ones, and Signal has become much more popular in my country ever since Trump II took office. Meta had to change the name of their company (Facebook) since it had such a bad name, and Zuck started a charm offensive.
They couldn't even bother to slap basic auth on it with password123 as the password.
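For perspective on how low that bar is, here's a sketch of HTTP Basic auth in nothing but the Python standard library (the credentials are the joke from above, not a recommendation):

    import base64
    from http.server import BaseHTTPRequestHandler, HTTPServer

    # Hypothetical, deliberately weak credentials for illustration only.
    EXPECTED = "Basic " + base64.b64encode(b"admin:password123").decode()

    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Reject any request that doesn't carry the expected Basic header.
            if self.headers.get("Authorization") != EXPECTED:
                self.send_response(401)
                self.send_header("WWW-Authenticate", 'Basic realm="docs"')
                self.end_headers()
                return
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"confidential files would go here\n")

    if __name__ == "__main__":
        HTTPServer(("127.0.0.1", 8080), Handler).serve_forever()

Even that token effort would have kept the documents off the open internet.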
This is HN. We understood exactly what “exposed … confidential files” meant before reading your overly dramatic scenario. As overdone as it is, it's not even realistic: a single mother is likely tiny potatoes in comparison to deep-pocketed law firms or large corporations.

The story is an example of the market self-correcting, but out comes this “building code” hobby horse anyway. All a software “building code” will do is ossify certain current practices, not even necessarily the best ones. It will tilt the playing field in favor of large existing players and to the disadvantage of innovative startups.

The model fails to apply in multiple ways. Building physical buildings is a much simpler, much less complex process with many fewer degrees of freedom than building software. Local inspectors applying the local municipality's code at least have clear jurisdiction, because the building has a fixed physical location. Who will write the “building code”? Who will be the inspectors?

This is HN. Of all places, I’d expect to see this presented as an opportunity for new startups, not calls for slovenly bureaucracy and more coercion. The private market is perfectly capable of performing this function. E&O and professional liability insurers, if they don't already, will soon be motivated by the lawsuits they see to demand regular pentests.

The reported incident is a great reminder of caveat emptor.

> Building physical buildings is a much simpler, much less complex process with many fewer degrees of freedom than building software.

I don't... think this is true? Google has no problem shipping complex software projects, yet their London HQ is years behind schedule and vastly over budget.

Construction is really complex. These can be mega-projects with tens of thousands of people involved, where the consequences of failure are injury or even death. When software failure does have those consequences - things like aviation control software, or medical device firmware - engineers are held to a considerably higher standard.

> The private market is perfectly capable of performing this function

But it's totally not! There are so many examples in the construction space of private markets being wholly unable to perform quality control because there are financial incentives not to.

The reason building codes exist and are enforced by municipalities is because the private market is incapable of doing so.
