hyperpape
While this doesn't settle any of our debates, it's interesting to note that there's a real question about whether today's debates have anything to do with what Postel originally meant: http://www.cookcomputing.com/blog/archives/000551.html.

The robustness principle is so compressed that it invites the reader to project an interpretation onto it.

StephenFalken
Great point.

The original Usenet comp.mail.pine newsgroup post [1] by Mark Crispin (father of the IMAP protocol):

  This statement is based upon a terrible misunderstanding of Postel's
  robustness principle. I knew Jon Postel. He was quite unhappy with
  how his robustness principle was abused to cover up non-compliant
  behavior, and to criticize compliant software.

  Jon's principle could perhaps be more accurately stated as "in general,
  only a subset of a protocol is actually used in real life. So, you should
  be conservative and only generate that subset. However, you should also
  be liberal and accept everything that the protocol permits, even if it
  appears that nobody will ever use it."
[1] https://groups.google.com/d/msg/comp.mail.pine/E5ojND1L4u8/i...
ironick
I don't give this reminiscence ANY credit. The very first version of Postel's law makes it VERY clear that Postel intended his law to deal with non-compliant behavior in a tolerant or "liberal" way: "In general, an implementation should be conservative in its sending behavior, and liberal in its receiving behavior. That is, it should be careful to send well-formed datagrams, but should accept any datagram that it can interpret (e.g., not object to technical errors where the meaning is still clear)."

See my potted history of Postel's law: http://ironick.typepad.com/ironick/2005/05/my_history_of_t.h...

hyperpape
Nice to see that so well documented: I'd only seen the one most common reference, RFC 793.

That said, it's still unclear how far this extends: the example given is of an unknown error code, which might lead you to think that the requirement is "syntactically well-formed input where you can't 100% determine the semantics." That's a far cry from the way browsers handle malformed HTML. Similarly, you have to apply some judgment concerning what an agent can interpret the meaning of.
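
To make that distinction concrete, here is a minimal, hypothetical sketch in Python (the wire format, the status codes, and every name in it are made up for illustration): a receiver that tolerates a well-formed message carrying an unknown code, but rejects input whose meaning it cannot determine at all.

  import re

  # Hypothetical wire format: "<three-digit status> <free-form reason>".
  STATUS_PATTERN = re.compile(r"^(\d{3}) (.*)$")
  KNOWN_STATUSES = {200: "ok", 404: "not found", 500: "server error"}

  def handle_response(line: str) -> dict:
      match = STATUS_PATTERN.match(line)
      if match is None:
          # Syntactically malformed: the meaning is not clear, so reject.
          raise ValueError(f"unparseable response line: {line!r}")
      code, reason = int(match.group(1)), match.group(2)
      # Well-formed but semantically unknown: the liberal reading is to
      # accept it and handle it generically rather than fail outright.
      return {"code": code, "reason": reason, "known": code in KNOWN_STATUSES}

  print(handle_response("299 something new"))   # tolerated, flagged as unknown
  # handle_response("total garbage") raises ValueError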

pierrebai
I have a hard time not being sarcastic about the author's naivete.

He does acknowledge that Postel's Maxim might be essential to any widely deployed protocol that wants to be successful. He also acknowledges that his alternative is inapplicable to the early life of a protocol.

The two main flaws in the reasoning are that incompatibilities or bugs are not intentional, and that success is contingent on something 'just working'. From a thousand-foot view, you want errors, whatever their source, to propagate as little as possible and to affect as little of the network as possible. Postel's Maxim provides that effect. Being strict ensures that some process somewhere, over which you have no control, will affect your system.

Fortunately, it's being applied everywhere, notwithstanding purists. Your house's electrical input gets filtered and aims to provide a standard voltage. Your computer's power supply filters that and aims to provide stable voltage and current. Your electronics are surrounded by capacitors... and it goes up the stack. It's just good engineering.

gedejong
Fortunately, Postel's law is not applied everywhere. Electrical deviations are formally specified, so engineers can design to tighter error bounds, resulting in a lower cost for the end product (or lower weight and size). Your computer's power supply will still produce magic smoke when connected to 330V or when subjected to strong fluctuations. The 'liberal' part is actually the 'strict' part: your electricity provider is obliged to provide you with an electric potential within certain boundaries.

The author's reasoning is simple: we want a proof of concept of an idea ASAP (lacking formal specifications of anomalies and error bounds), and when it is successful, error bounds and boundary conditions, including specifications thereof, should be communicated and implemented. That seems like a cogent and professional point to make, given the complexity of our systems.

jacquesm
Plenty of computer power supplies will produce magic smoke at 175V too! They interpret 'substantially less than 240V' as 'switch to 115V mode' and then happily blow up with the excess voltage as input.
Animats
I've argued in the past for an intermediate position, especially for HTML. Browsers should be moderately tolerant of bad HTML. But rather than trying to handle errors invisibly, they should revert to a simplified rendering system intended to get the content across without the decorative effects. After the first error, a browser might stop processing further Javascript, display a red band indicating defective HTML, and display all text in the default font. It might also report the error to the server in some way.

Read through the error-recovery specification for HTML5. It's many pages of defined tolerance for old bugs. Then read the charset-guessing specification for HTML5, which is wildly ambiguous. (Statistical analysis of the document to guess the charset is suggested.) The spec should have mandated a charset parameter in the header a decade ago. If there's no charset specification, documents should render in ASCII with hex for values > 127.
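
As a rough sketch of the degraded-rendering idea (nothing here reflects how any real browser works: the tag-balance check, the text-only fallback, and the banner text are all stand-ins), a page that fails a crude well-formedness check gets rendered as plain text behind a visible warning:

  from html.parser import HTMLParser

  VOID_TAGS = {"br", "img", "hr", "meta", "link", "input"}

  class TagBalanceChecker(HTMLParser):
      """Crude well-formedness check: non-void start tags must close in order."""
      def __init__(self):
          super().__init__()
          self.stack, self.errors = [], []

      def handle_starttag(self, tag, attrs):
          if tag not in VOID_TAGS:
              self.stack.append(tag)

      def handle_endtag(self, tag):
          if not self.stack or self.stack.pop() != tag:
              self.errors.append(f"mismatched </{tag}>")

  class TextOnly(HTMLParser):
      """Degraded renderer: keep the text, drop tags, styling and scripts."""
      def __init__(self):
          super().__init__()
          self.chunks = []

      def handle_data(self, data):
          self.chunks.append(data)

  def render(page: str) -> str:
      checker = TagBalanceChecker()
      checker.feed(page)
      checker.close()
      if not checker.errors and not checker.stack:
          return page  # stand-in for the normal rendering path
      # After the first error: no scripts, default fonts, and a banner
      # that neither the user nor the page author can miss.
      fallback = TextOnly()
      fallback.feed(page)
      problems = checker.errors or ["unclosed tags"]
      return "[DEFECTIVE HTML: " + "; ".join(problems) + "]\n" + "".join(fallback.chunks)

  print(render("<p>fine</p>"))
  print(render("<p><b>oops</p>"))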

You've got two browsers to choose from: one that handles every site you visit without a problem, and one that throws a bunch of obscure error messages on about 20% of the sites you visit.

Which do you think most people will choose?

femto113
I think this would have been great if done from the beginning, but even in early versions of Mosaic malformed HTML would still appear "correct" visually, and since once it looked OK most people figured it was OK, we've been buried under broken HTML from the beginning. The idea that the browser "handles every site without a problem" is slightly misleading though, since even if everything looks OK the user is paying a price of lower performance and a slower pace of innovation as browser developers devote huge amounts of time, money, and attention to not puking on all that broken HTML.
To a large degree this is it. Nobody bats an eye if a misplaced quote somewhere in a Python program causes the whole program to fail to start, but XHTML breaking pages on syntax errors was considered a terrible idea because the old way worked fine(tm).

However, Python source code is not typically dynamically generated, while HTML is, increasing the probability of errors the site author could not trivially predict and the user can do nothing about.
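
A tiny illustration of that fragility, assuming nothing more than Python's standard html.escape and a made-up link template: the naive version looks fine for every value the author tested, until one user-supplied string breaks the markup.

  from html import escape

  def link(title: str, url: str) -> str:
      # Naive generation: fine for every value the author happened to test...
      return f'<a href="{url}" title="{title}">{title}</a>'

  def safe_link(title: str, url: str) -> str:
      # ...but user-supplied data has to be escaped to stay well-formed.
      return f'<a href="{escape(url)}" title="{escape(title)}">{escape(title)}</a>'

  print(link('Say "hello"', "https://example.com"))       # broken attribute quoting
  print(safe_link('Say "hello"', "https://example.com"))  # stays well-formed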

Yeah, we shouldn't apply the punishment to the user. It should do a best-effort render and then DDoS the site. /s
Animats
Good idea. Another version: if there are any errors in the HTML, the browser blocks all ads and trackers. Bad HTML would be fixed so fast...
bcoates
Unfortunately, software of all sorts has a pathological enthusiasm for adding defaulted, wrong metadata to everything. (Look into medical charting and drug-dispensing software sometime if you're looking for a cheap scare.)

Character-set and language tags are useless in practice, even the dumbest heuristics defeat them. Statistical analysis is so effective that encoding metadata should be forbidden, not required.

datenwolf
groan

"Fail early and hard, don't recover from errors" is a recipe for disaster.

That principle, applied to critical systems software engineering, leads to humans getting killed. E.g. in aerospace the result is airplanes falling out of the sky. Seriously. The Airbus A400M that recently crashed in Spain did so because somewhere in the installation of the engine control software the control parameter files were rendered unusable. The engine control software failed hard, even though this would have been a recoverable error (a set of default control parameters hardcoded into the software could have put the engines into a fail-safe operational regime); instead, the engines shut off.
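
A minimal sketch of the recovery being argued for here, under the assumption of a JSON parameter file and entirely made-up parameter names: if the installed file is unusable, log loudly and fall back to conservative hardcoded values rather than shutting down.

  import json
  import logging

  # Deliberately conservative values that keep the engines in a safe regime.
  # These parameter names, and the JSON format, are purely illustrative.
  SAFE_DEFAULTS = {"max_thrust_pct": 60.0, "fuel_flow_limit": 0.8}

  def load_control_params(path: str) -> dict:
      try:
          with open(path) as f:
              params = json.load(f)
          missing = set(SAFE_DEFAULTS) - set(params)
          if missing:
              raise ValueError(f"missing parameters: {missing}")
          return params
      except (OSError, ValueError) as exc:
          # Recoverable: log loudly and run degraded, but keep the engines running.
          logging.error("control parameters unusable (%s); using safe defaults", exc)
          return dict(SAFE_DEFAULTS)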

In mission and life critical systems there are usually several redundant core systems and sensors, based on different working principles, so that there's always a workable set of information available. Failing hard renders this kind of redundancy futile.

No, Postel's Maxim holds as strong as ever. The key point here is: "Be conservative in what you send", i.e. your implementation should be strict in what it subjects other players to.

Also, being strict in what's expected can easily be exploited to DoS a system (Great Firewall RST packets, anyone?).

jcranmer
You're missing the point. The problematic aspect of Postel's Maxim is not "be conservative in what you send" but "be liberal in what you accept"--as the draft points out, being liberal in what is accepted tends to cause people to become liberal in what they send and prevents implementers from tightening what they accept. HTML is the best poster child for the detrimental effects this causes (HTML5 mandates how you have to parse tag soup, because any HTML parser that wants to work has to handle the tag soup that is very much used in practice), but HTTP and MIME are easily in similar boats.

The point of the draft is best summarized as "if you can detect that the other side has a problem in its implementation, raise red flags early and noticeably." It's not safe to recover to some default, because that can make you think that things are working when they're not--imagine if the engine control software defaulted to assuming a different type of engine than what existed. The resulting confusion could equally destroy the engines; this is similar to the kind of mistake that caused the Ariane 5 rocket to explode.
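
For contrast with the safe-defaults sketch above, here is a hypothetical sketch of the fail-fast reading: don't silently substitute values for a configuration you couldn't read; raise an explicit fault so the redundant parts of the system can react. Again, the file format, names, and exception are illustrative only.

  import json
  import logging

  class ConfigFault(Exception):
      """Signals the surrounding system to switch to a redundant channel."""

  def load_params_or_flag(path: str, required: frozenset) -> dict:
      try:
          with open(path) as f:
              params = json.load(f)
      except (OSError, ValueError) as exc:
          # Don't silently invent values for hardware we may have misidentified.
          # Report the fault so redundancy can take over, knowing this channel
          # is degraded.
          logging.critical("engine config unusable: %s", exc)
          raise ConfigFault(path) from exc

      missing = required - set(params)
      if missing:
          logging.critical("engine config missing %s", sorted(missing))
          raise ConfigFault(path)
      return params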

  E.g. in aerospace the result is airplanes falling out of the sky. Seriously.

You're misinterpreting 'fail fast' - it doesn't mean 'entire system should fail catastrophically at slightest problem' or 'systems should not be fault-tolerant'. It just means that components should report failure as soon as possible so the rest of the system can handle it accordingly instead of continuing operation with an unrecognized faulty component leading to unpredictable outcomes.

In particular, "Just Sort Of Keep Going And Hope It All Works Out Eventually" is also a great way to crash planes and kill people, if we're going to pretend that everything's safety critical.
ams6110
"One size fits all" is also a recipe, if not for disaster, then for unnecessary overengineering.

Fail hard and don't recover is absolutely fine in many scenarios, especially ones where no lives or expensive property are on the line.

Control software for jet engines is a whole different kettle of fish from sharing photos online. I would dare say most of us here have never worked on software that critical. The approach---from design to implementation to testing---is formalized to a degree most of us in the "agile" world of web apps could not tolerate.

aidenn0
It is a good maxim for internetworked software, where the cost of failing early is much lower than the cost of leaking information due to buggy error-handling code.
That's fine if you make the assumption, explicitly not made here, that you have some control over who sends you things.
nabla9
I have always felt that Postel's Maxim, combined with the network effect, leads to complications in the long term even though it promotes interoperability in the short term.

It's a game-theoretically successful strategy to get your implementation to work with everyone, but when you accept sloppy input, you allow sloppy implementations to become popular.

Eventually the de facto protocol becomes unnecessarily complicated and you need to understand the quirks of popular implementations.

TeMPOraL
Like the equivalent rule for human interaction, this is a coordination problem and it fails in the same way: when you're nice to people, you get overrun by assholes, people who don't care about being "conservative in their sending behaviour". This rule is really great if you can make everyone stick to it. In a competitive environment, it's impossible without some additional way to ensure assholes will be punished.
kabdib
Basically, we shouldn't issue standards or RFCs without meaningful test vectors and tests, updated whenever bugs are found in them (or in the RFC).

Expecting someone to (say) read the HTTP spec and write a compliant implementation without tests that everyone else is using as well is lunacy, and leads to the nightmare we have today.

Standards without engineering to back them up are bad.

Side effect: Committees that produce "ivory tower" standards that are unimplementable will find that their work is ignored.

Another side effect: Standards will get simpler, because over-complex nonsense will be obvious once the committee gets down to making an exemplar actually work.

Not that it will ever happen...

[I helped write an HTTP proxy once. The compliant part took a couple weeks; making it work with everyone else's crappy HTTP implementation was a months-long nightmare on rollerskates]
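
A minimal sketch of what spec-shipped test vectors could look like, using percent-decoding as a stand-in for a real protocol and urllib.parse.unquote as the implementation under test; the vectors themselves are illustrative, not taken from any RFC suite.

  from urllib.parse import unquote

  # (input, expected_output) pairs that would ship with the spec itself,
  # and grow whenever an ambiguity or bug is discovered.
  TEST_VECTORS = [
      ("%41", "A"),
      ("%20", " "),
      ("a%2Fb", "a/b"),
      ("plain", "plain"),
  ]

  def run_vectors(decode):
      failures = [(raw, expected, decode(raw))
                  for raw, expected in TEST_VECTORS
                  if decode(raw) != expected]
      for raw, expected, got in failures:
          print(f"FAIL: decode({raw!r}) = {got!r}, expected {expected!r}")
      return not failures

  assert run_vectors(unquote)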

InclinedPlane
What is missing from this is the ability to toggle a "strict mode" in a browser, or even a "reference implementation browser". Right now developers have to rely on their knowledge alone, combined with the simple fact of whether or not their work seems to render correctly on the major browsers. That is where you run afoul of Postel's Maxim, because it should be possible to use these tools (like browsers) in a way that provides the sort of "hey, don't do that" feedback during development, even though during normal operation the browser should do its best to make do with whatever it's given.

That same pattern exists elsewhere too: people often need to do "API science" to figure out how to use various tools, with the common result being that they discover how to use those tools seemingly effectively, but incorrectly.
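
A hypothetical sketch of that kind of toggle: the same (made-up, deliberately simplistic) attribute parser in both modes, where strict mode turns tolerated sloppiness into loud development-time feedback instead of rejecting the input.

  import re
  import warnings

  ATTR = re.compile(r'(\w+)\s*=\s*("([^"]*)"|(\S+))')  # quoted or bare value

  def parse_attrs(tag_body: str, strict: bool = False) -> dict:
      attrs = {}
      for match in ATTR.finditer(tag_body):
          name, _, quoted, bare = match.groups()
          if bare is not None and strict:
              # Tolerated in normal mode; flagged during development.
              warnings.warn(f"unquoted attribute value: {name}={bare}")
          attrs[name] = quoted if quoted is not None else bare
      return attrs

  print(parse_attrs('href=index.html class="nav"'))               # silent, tolerant
  print(parse_attrs('href=index.html class="nav"', strict=True))  # emits a warning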

sytelus
I think both of these philosophies represent extremes.

This one means forgive all mistakes: "Be liberal in what you accept, and conservative in what you send."

This one means forgive no mistakes: "Protocol designs and implementations should be maximally strict."

I would suggest an alternative, "forgive most mistakes, but always let them know they made a mistake":

Be conservative in what you send, and as liberal as possible in what you accept, but always let them know what they could have done better.
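
A minimal sketch of that middle ground, using a toy date field (the formats and messages are illustrative): accept the sloppy form, but hand back a note the sender can act on.

  from datetime import date

  def parse_date(field: str):
      """Accepts 'YYYY-MM-DD' (preferred) or 'DD/MM/YYYY' (tolerated),
      returning (value, notes) so the sender can be told what to fix."""
      notes = []
      try:
          return date.fromisoformat(field), notes
      except ValueError:
          pass
      try:
          day, month, year = (int(p) for p in field.split("/"))
          notes.append(f"non-preferred date format {field!r}; please send YYYY-MM-DD")
          return date(year, month, day), notes
      except ValueError:
          raise ValueError(f"unrecognized date: {field!r}")

  print(parse_date("2015-05-09"))   # accepted silently
  print(parse_date("09/05/2015"))   # accepted, with a warning to send back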

eridius
Does that solve interoperability though? If you accept it and let them know, you're still accepting it, meaning you're still going to tolerate that protocol implementation continuing its bad behavior in the future. And if the protocol implementation becomes widely used, then we have the exact same maintenance issue as we do with Postel's Maxim.
jmount
My take on this: http://www.win-vector.com/blog/2010/02/postels-law-not-sure-...

Though I am beginning to think it comes down to "do you want to make it easy to run on a dev box, or easy to run in production?"

poofyleek
Being liberal in acceptance does not discount conservatism in output. If only one side of the maxim were diligently followed, there would be fewer issues. For example, if many implementations were conservative in sending, that alone would be sufficient. After that, allowing for liberal handling of input compensates for those who did not follow the conservatism in output. Postel can't easily be blamed for long-term entropic outcomes, accelerated by the abundance of implementations that ignore both sides of the maxim.
"Conservative in what you accept" works fine if the first implementation shipped is complete and bug-free. If you have two implementations that have incompatible bugs then third parties have to detect who to be bug-compatible with. If your system is not backwards compatible (easily achieved in HTML by ignoring new elements), then you have to do version detection as well.
protomyth
So, if the original spec for HTML had rejected ill-formed pages and enforced nesting and end tags, would the web be better today?

I think the lack of formality actually hurt non-technical users because it made tools harder to program.

walshemj
But you then end up with OSI instead of TCP/IP and the internet would look very different.
