> As the robustness principle goes [...]
"The Harmful Consequences of the Robustness Principle" https://tools.ietf.org/html/draft-thomson-postel-was-wrong-0...
I want to upvote this so many more times than I'm able to.
The principle is even more harmful because it sounds so logical. If many JSON parsers accept your JSON object even though it is not valid JSON, any new parser that rejects it will be booed as faulty.
If the input doesn't conform to the spec, throw an error. The sender is wrong; they deserve to know it and need to fix their output.
JSON parsers may also permit various extensions that are very explicitly not JSON. This means they may accept a handcrafted, technically invalid JSON document. A JSON encoder, however, must never generate a document containing such "extensions", as the result would not be JSON.
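A concrete case of this asymmetry is Python's standard json module: by default it is liberal on decode (it accepts NaN and Infinity, which RFC 8259 does not allow), but both sides can be made strict.

```python
import json

# Python's json module is "liberal" by default: it accepts NaN and
# Infinity, even though neither is valid JSON per RFC 8259.
doc = '{"value": NaN}'
print(json.loads(doc))  # parses without complaint

# Strict decoding: reject these extensions via parse_constant,
# which is called for the tokens NaN, Infinity, and -Infinity.
def reject(const):
    raise ValueError(f"not valid JSON: {const}")

try:
    json.loads(doc, parse_constant=reject)
except ValueError as e:
    print(e)

# The encoder side: with allow_nan=False, dumps refuses to
# generate a document that wouldn't be JSON.
try:
    json.dumps({"value": float("nan")}, allow_nan=False)
except ValueError as e:
    print(e)
```

Note that the encoder is strict only if you opt in; out of the box, json.dumps will happily emit NaN, which is exactly the kind of "liberal sender" the robustness principle warns against being.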
This concept of having parsers accept more than necessary follows best practices of robustness. As the robustness principle goes: "Be conservative in what you do, be liberal in what you accept from others".