
Do you have a survey or other citation for it being a bad idea? I get that it enables bad behavior, per se. However, the idea of rejecting a customer/client because they did not form their request perfectly seems rather anti-customer.

Ideally, you'd both accept and correct. But that is the idea, just reworded.
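To make "accept and correct" concrete, here is a minimal sketch in Python. It is a hypothetical example, not anything from the thread: a loader that tolerates two common client mistakes in JSON-ish input (trailing commas, single quotes) but always normalizes to strict JSON internally, rejecting anything else.

```python
import json
import re

def lenient_load(text):
    """Accept JSON with a couple of common client mistakes,
    but normalize to strict JSON internally."""
    try:
        return json.loads(text)  # fast path: input was already valid
    except json.JSONDecodeError:
        pass
    # Correct two frequent mistakes, then re-parse strictly.
    fixed = re.sub(r",\s*([}\]])", r"\1", text)  # drop trailing commas
    fixed = fixed.replace("'", '"')  # naive quote fix; breaks on apostrophes
    return json.loads(fixed)  # anything else is still rejected

print(json.dumps(lenient_load('{"a": 1,}')))  # prints {"a": 1}
```

The key point is that leniency lives only at the boundary: downstream code only ever sees the corrected, canonical form.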


shagie
taeric OP
The discussion was fun. And it seems evenly split, at a quick reading.

More, I think it splits on how you read it. If you view it as an absolute maxim to excuse poor implementations, it gets panned. If you view it as a good-faith effort not to choke on the first mistake, you probably like it.

This is akin to the grammar police. In everyday encounters, there is no real place for grammar policing. However, you should try to be grammatically correct.

still_grokking
> This is akin to the grammar police. In everyday encounters, there is no real place for grammar policing. However, you should try to be grammatically correct.

That's because most humans have feelings, but most machines don't. So it's not comparable.

taeric OP
I meant that grammar policing does little to help the exchange of information. Feelings aside.

A lack of enforced language standards does end up producing a language full of difficult-to-learn inconsistencies, though.
taeric OP
It makes for difficult-to-codify inconsistencies. Most aren't that difficult to learn, oddly. Especially if you are just trying to be conversational.

Edit: I'm going specifically off the evidence of teaching my kids. They have picked up language almost entirely by talking to us. Even pronouns, adjectives, adverbs, etc. What they have not learned is why certain words are used when another would have worked.

The problem with this idea is that different consumers might accept and correct different subsets of malformed input.

If some of those consumers become dominant, producers might start depending on that behavior, and it becomes a de facto standard. This is literally what happened to HTML, and it holds true for many other Internet protocols.

If you're looking for some external reading, I found at least this:

* https://tools.ietf.org/html/draft-thomson-postel-was-wrong

I think you'll find few protocol designers arguing _for_ the robustness principle these days.

taeric OP
You'll also find few protocol designers designing anything as robust as the old protocols. :)

I mean, don't go out of your way to under-specify input. But practically nobody is going back to the heavy schemas of XML over simple JSON. Even if they probably should.

I feel this is an antifragile position. Try not to encourage poor input. But more importantly, be resilient to it, not dismissive of it.

Robust as the old protocols? The early TCP specifications were very much underspecified. Maybe you have something else in mind.
taeric OP
Fair. I view them as what they grew into, not as what they were initially designed as. Probably not a straightforward comparison.

I've just gotten weary of so many replacement protocols that get dreamed up and go nowhere. Often because they didn't actually learn all of the lessons from predecessors.

astrobe_
It goes against safety.

"Accept and correct" in the absence of ECC is just delusion if not hubris. The sender could be in a corrupted state and could have sent data it wasn't supposed to send. Or the data could have been corrupted during transfer, accidentally or deliberately. You can't know unless you have a second communication channel (usually an email to the author of the offending piece of software), and what you actually do is literally "guess" the data. How can it go wrong?

taeric OP
In a world of signed requests, bit flips are less of a concern. If the signature doesn't match, reject the call. Which implies I clearly don't mean accept literally everything. Just work within your confines and try to move the ball forward, if you can. This is especially true if you are near the user. Consider search engines with their "did you mean?" prompts. Not always correct, but a good feature when few results are found.
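The signed-request case is easy to sketch. A hypothetical HMAC check in Python (the key and payload are made up for illustration): corruption anywhere in the body changes the MAC, so a mismatch means reject, with no guessing possible.

```python
import hmac
import hashlib

SECRET = b"shared-secret"  # hypothetical shared key

def verify(body: bytes, signature_hex: str) -> bool:
    # Any bit flip in the body changes the MAC, so a mismatch
    # means reject -- there is nothing sensible to "correct".
    expected = hmac.new(SECRET, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature_hex)

good = hmac.new(SECRET, b'{"q": "hi"}', hashlib.sha256).hexdigest()
print(verify(b'{"q": "hi"}', good))  # True: signature matches
print(verify(b'{"q": "hI"}', good))  # False: one changed byte, reject
```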

For system-to-system, things are obviously a bit different. Don't just guess at what was intended. But, ideally, if you take a date in, be like the GNU date utility and try to accept many formats. But be clear about what you will return.

And, typically, have a defined behavior. That could be to crash. Doesn't have to be, though. Context of the system will be the guide.
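A sketch of that date-handling stance in Python (the whitelist of formats is a made-up example, not GNU date's actual grammar): be liberal across a defined set of inputs, always return one canonical output, and fail with a defined error rather than guessing.

```python
from datetime import datetime

# Hypothetical whitelist of accepted input formats.
ACCEPTED = ["%Y-%m-%d", "%d/%m/%Y", "%b %d, %Y", "%Y-%m-%dT%H:%M:%S"]

def parse_date(text: str) -> str:
    """Accept several date formats; always return ISO 8601."""
    for fmt in ACCEPTED:
        try:
            return datetime.strptime(text.strip(), fmt).date().isoformat()
        except ValueError:
            continue
    # Defined behavior on unrecognized input: a clear error, not a guess.
    raise ValueError(f"unrecognized date: {text!r}")

print(parse_date("Mar 2, 2024"))  # prints 2024-03-02
```

Leniency here is bounded: the set of accepted formats is explicit, and the output contract is a single format regardless of what came in.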

Boulth
The consumer is interested in fulfilling their need, so they will fix their request so that it gets processed.
taeric OP
Or they will pick a service that works with them. This is literally how Google won most of their market share. Sure, there used to be a bit of syntax on the search, but Google always had a single input field and did not barf requests back at users for putting a value in the wrong field.
Boulth
This is about protocols and data formats, not a user-facing text box (the original post mentioned JSON). It's interesting that you bring up Google, which led efforts to replace the more "liberal" HTTP/1.1 with the binary, strict HTTP/2.
taeric OP
My point is that those pushing for stricter formats often have good intentions. Strong arguments, even. However, what is required to grow adoption of something is different from hardening it. And typically, hardening something makes it brittle in some way. (Which is super risky at the adoption stage.)

And, of course, most people don't actually understand why they succeeded at something. It is easy to trace failure to a specific cause. It is much harder to trace success to a combination of many causes.
