Preferences

0xcde4c3db
> "be liberal in what you accept, and be conservative in what you send"

This is commonly known as Postel's Law, and comes from one of the TCP RFCs [1].

[1] https://en.wikipedia.org/wiki/Robustness_principle


This is also widely considered a bad idea now. Building liberal consumers allows producers to be sloppy. Over time, new consumers must conform to those sloppy producers to maintain compatibility.

Just look at the clusterfuck that HTML5 has become. You need to have extremely deep pockets to enter that market.

_greim_
> Just look at the clusterfuck that HTML5 has become.

Ouch. I feel like this is kind of unfair. XML, HTML 1-4, and HTML5 all differ in how they treat Postel's law. XML rejects it at the spec level: if you send garbage to a parser, it bails immediately, which is nice. HTML5 embraces Postel's law at the spec level: if you send garbage to an HTML5 parser, there's an agreed-on way to deal with it gracefully. Also nice. The problem was rather with HTML 1-4, which embraced Postel's law promiscuously, at the implementation level. There were specs, but mainstream implementations largely ignored them and all handled garbage input slightly differently. This is what created the aforementioned clusterfuck.
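
To make the contrast concrete, here's a minimal Python sketch. (Caveat: the stdlib html.parser is not a real HTML5 parser, but it illustrates the strict-versus-lenient behavior.)

    import xml.etree.ElementTree as ET
    from html.parser import HTMLParser

    garbage = "<p>unclosed <b>tags"

    # XML: the spec mandates rejection, so the parser bails immediately.
    try:
        ET.fromstring(garbage)
    except ET.ParseError as err:
        print("XML parser rejected it:", err)

    # Lenient HTML-style parsing: the same garbage is accepted and handled.
    class TagPrinter(HTMLParser):
        def handle_starttag(self, tag, attrs):
            print("HTML parser accepted tag:", tag)

    TagPrinter().feed(garbage)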

Yeah, this is absolutely what I meant. HTML5's complexity is a symptom of this problem.

I'm a bit worried about the authors taking this overboard and trying to redefine the URL standard with similar complexity.

erik_seaberg
HTML5 only provides the "be liberal in what you accept" error handling; its authors have never seen fit to write a "be conservative in what you send" grammar for authors and validators.
taeric
Do you have a survey or other citation for it being a bad idea? I get that it enables bad behavior, per se. However, the idea of rejecting a customer/client because they did not form their request perfectly seems rather anti-customer.

Ideally, you'd both accept and correct. But that is the idea, just reworded.

shagie
taeric
The discussion was fun. And it seems evenly split, at a quick reading.

More, I think it splits on how you read it. If you view it as an absolute maxim to excuse poor implementations, it gets panned. If you view it as good-faith behavior not to choke on the first mistake, you probably like it.

This is akin to grammar policing. In real-life encounters, there is no real place for it. However, you should still try to be grammatically correct.

still_grokking
> This is akin to grammar policing. In real-life encounters, there is no real place for it. However, you should still try to be grammatically correct.

That's because most humans have feelings. But most machines don't. So that's not comparable.

The problem with this idea is that different consumers might accept and correct different subsets of malformed input.

If some of those consumers become dominant, producers might start depending on that behavior, and it becomes a de facto standard. This is literally what happened to HTML, and it holds true for many other Internet protocols.
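
As a toy illustration of that divergence, take a trailing comma, which some consumers accept and others reject. In this Python sketch, json and ast merely stand in for two independent consumers of the same bytes:

    import ast
    import json

    payload = '{"a": 1,}'  # trailing comma: invalid JSON, valid Python literal

    # A strict consumer rejects the input outright.
    try:
        json.loads(payload)
    except json.JSONDecodeError as err:
        print("strict consumer rejected it:", err)

    # A lenient consumer quietly accepts the same bytes. If the lenient
    # consumer dominates, producers start relying on its leniency, and it
    # becomes a de facto standard.
    print("lenient consumer accepted it:", ast.literal_eval(payload))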

If you're looking for some external reading, I found at least this:

* https://tools.ietf.org/html/draft-thomson-postel-was-wrong

I think you'll find few protocol designers arguing _for_ the robustness principle these days.

taeric
You'll also find few protocol designers designing anything as robust as the old protocols. :)

I mean, don't go out of your way to under-specify input. But practically nobody is going back to the heavy schemas of XML over simple JSON. Even if they probably should.

I feel this is an antifragile position. Try not to encourage poor input. But more importantly, be resilient to it, not dismissive of it.

Robust as the old protocols? The early TCP protocols were very much underspecified. Maybe you have something else in mind.
taeric
Fair. I view them as what they grew into, not as what they were initially designed as. Probably not a straightforward comparison.

I've just gotten weary of so many replacement protocols that get dreamed up and go nowhere, often because they didn't actually learn all the lessons of their predecessors.

astrobe_
It goes against safety.

"Accept and correct" in the absence of ECC is just delusion if not hubris. The sender could be in a corrupted state and could have sent data it wasn't supposed to send. Or the data could have been corrupted during transfer, accidentally or deliberately. You can't know unless you have a second communication channel (usually an email to the author of the offending piece of software), and what you actually do is literally "guess" the data. How can it go wrong?

taeric
In the world of signed requests, bit flips are less of a concern. If the signature doesn't match, reject the call. Which implies I clearly don't mean accept literally everything. Just work within your confines and try to move the ball forward, if you can. This is especially true if you are near the user. Consider search engines with their "did you mean?" prompts. Not always correct, but a good feature when few results are found.
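
As a minimal sketch of that reject-on-mismatch check (the shared key is hypothetical):

    import hashlib
    import hmac

    SECRET = b"hypothetical-shared-key"

    def verify(payload: bytes, signature: str) -> bool:
        # Recompute the HMAC and compare in constant time. Any corruption
        # of the payload or signature fails the check, so the call is
        # rejected instead of guessing what the sender intended.
        expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
        return hmac.compare_digest(expected, signature)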

For system-to-system, things are obviously a bit different. Don't just guess at what was intended. But, ideally, if you take a date in, be like the GNU date utility and try to accept many formats. But be clear in what you will return.

And, typically, have a defined behavior. That could be to crash. Doesn't have to be, though. Context of the system will be the guide.
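
Something like this sketch, say. The accepted formats are hypothetical; the point is liberal input, one canonical output, and a defined failure mode:

    from datetime import datetime

    # Hypothetical set of input formats this system is willing to accept.
    ACCEPTED_FORMATS = ("%Y-%m-%d", "%d/%m/%Y", "%b %d, %Y")

    def parse_date(text: str) -> str:
        # Liberal in what we accept: try several input formats.
        for fmt in ACCEPTED_FORMATS:
            try:
                parsed = datetime.strptime(text.strip(), fmt)
            except ValueError:
                continue
            # Conservative in what we send: always return ISO 8601.
            return parsed.date().isoformat()
        # Defined behavior on failure: raise rather than silently guess.
        raise ValueError(f"unrecognized date: {text!r}")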

Boulth
A consumer is interested in fulfilling their need, so they will fix their request so that it gets processed.
taeric
Or they will pick a service that works with them. This is literally how Google won most of their market share. Sure, there used to be a bit of syntax on the search, but Google always had a single input field and did not barf requests back at users because they put something in the wrong field.
Boulth
This is about protocols and data formats, not user input text boxes (the original post mentioned JSON). It's interesting that you bring up Google, which led the effort to replace the more "liberal" HTTP/1.1 with the binary, strict HTTP/2.
taeric
My point is often that those pushing for stricter formats have good intentions. Strong arguments, even. However, what is required to grow the adoption of something is different from hardening it. And typically, hardening something makes it brittle in some way. (Which is super risky at the adoption stage.)

And, of course, most people don't actually understand why they succeeded at something. It is easy to understand failure from a specific cause. It is much more difficult to understand success from a combination of many causes.

digi_owl
The HTML5 clusterfuck comes from the biggest players being allowed to adjust the goal as they see fit, when they see fit (aka "living document").
rhapsodic
> Just look at the clusterfuck that HTML5 has become. You need to have extremely deep pockets to enter that market.

What do you mean by "enter that market"?

hyperdimension
I think they mean needing deep pockets to write a new browser, with all the complexity that modern HTML+JS entails.
projektfu
The good news is that Firefox and Chromium have pretty open licenses, so you only need to change what you want. Of course, you need to grok it, which isn't trivial. But writing a browser hasn't been easy since... Cello?
Forking an existing browser isn't exactly writing a new browser; it's modifying one. That process very likely limits innovation in the space.
kbouck
Here are some thoughtful arguments against "Postel's Law":

https://tools.ietf.org/html/draft-thomson-postel-was-wrong-0...
