Telling people to "just" take actions that decrease the reliability and rigor of their data because of... vogons?... is one of those weird middlebrow things that HN tends to steer clear of, last I checked.
(edit: To be clear, I get the reference; I just think it's a silly one, both because of the childish regard the poster I'm replying to has for other people and because, textually, it doesn't even hang together.)
I guess what I am saying is that JSON was created for simplicity and needs no updates.
XML already exists, as do other formats like BSON and YAML; if none of those fit, create a new one that suits more detailed needs.
The sole reason JSON is so successful is that it has fought against the 'vogon' complication and bureaucracy that riddled XML and many binary formats of the past. JSON is for dynamic, simple needs, and there are plenty of more verbose formats for other needs. JSON works from the front-end to the back-end, and there are domain-specific ways to store more complex data without changing the standard; if that doesn't work, move to another format. The goal of many seems to be to make JSON more complex rather than to understand that it was created solely for simplicity. If it is already hard to parse, it will be worse once you add many versions of it and more complexity.
I also find it interesting that we seem to be circling back to binary and complex formats. HTTP/2 might be part of the reason this is happening, along with big tech turning away from open standards.
Binary formats lead to bigger minefields if they need to change often. Even when it comes to file formats like Microsoft Excel's xls, for example: those are convoluted and were made more complex than needed, leading Microsoft themselves to create xlsx, which is XML based and is still more complicated than it needs to be. Microsoft has spent lots of money on version converters and issues caused by their own binary choices and lock-in [1].
> As Joel states, a normal programmer would conclude that Office’s binary file formats:
- are deliberately obfuscated
- are the product of a demented Borg mind or vogon mind
- were created by insanely bad programmers
- and are impossible to read or create correctly.
A binary data/storage format that has to change often will eventually become convoluted, because it is easier to just tack something onto the end of the blob than to think about structure and version updates. Eventually it is a big ball of obfuscated data. JSON and XML are at least keyed, and JSON is more tolerant of changes and versioning than XML or binary.
Much of the move to binary is reminiscent of earlier motivations that led to lock-in and ownership, with some engineer adding complexity toward those ends.
There are good and bad reasons to use every format. If JSON doesn't suit your needs for numeric precision or length, and storing, say, a bigint as a string with a type key describing it as a bigint doesn't work for you, maybe JSON isn't the format for the task.
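For instance, a minimal sketch of that string-plus-type workaround in plain JavaScript (the "type"/"value" field names are just an illustrative convention, not any standard):

    // Carry a bigint as a decimal string with a type tag, then restore it.
    const payload = JSON.stringify({ type: "bigint", value: (2n ** 64n).toString() });
    // -> {"type":"bigint","value":"18446744073709551616"}
    const parsed = JSON.parse(payload);
    const big = parsed.type === "bigint" ? BigInt(parsed.value) : parsed.value;
    // big === 18446744073709551616n, no precision lost in transit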
SOAP, though, was probably created by vogons outright, primarily as lock-in: WSDL and schemas/DTDs never really aimed at interoperability but at owning the standard by piling on complexity, with embrace, extend, extinguish in mind. That overcomplexity is why web services were ultimately won by JSON/REST/HTTP/RPC.
JSON is JavaScript Object Notation and was created for exactly that; because it is so simple, its usage spread to APIs, frontends, backends and more. People trying to add complexity break it for the format's original goal.
JSON won due to simplicity, and many want to take away that killer feature. Keeping things simple is what the best programmers/engineers do, and it is many times harder than just adding more complexity.
[1] https://www.joelonsoftware.com/2008/02/19/why-are-the-micros...
The real question is: with the benefit of hindsight, could you define a better but similarly simple format?
Would an alternative to JSON that specified the supported numeric ranges be less simple? Not really. Would it be better? Yes. The fact that you can try to represent integers bigger than 2^53 but they silently lose data makes no sense except in light of JSON being defined around the quirks of JavaScript.
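A quick illustration of that silent loss in any stock JavaScript engine, where JSON numbers land in IEEE-754 doubles:

    Number.MAX_SAFE_INTEGER;        // 9007199254740991, i.e. 2^53 - 1
    JSON.parse("9007199254740993"); // 9007199254740992 -- the parser rounds, no error
    Number.isSafeInteger(JSON.parse("9007199254740993")); // false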
It's true that different tools are adapted for different uses. But sometimes one tool could have been better without giving up any of what made it useful for its niche.
I think the only answer to that question is to build it separately from JSON if you think it can be better; if it is truly better, it will win in the market. There is no reason to break JSON and add complexity to its parsing/handling. Implementing simplicity is 10x harder than implementing a format that meets everyone's needs and ultimately piles on complexity.
The problem is when people want to add complexities to JSON. There is nothing stopping anyone from creating a new standard that does all of that. But I will argue until the end of time that JSON is successful due to simplicity, not edge cases.
Everything you mention can be implemented in JSON as a string plus type info; wanting the actual type baked into the format may be the problem, because it doesn't fit the use case of simplicity over edge cases. Your use case is one of hundreds of thousands that people want in JSON.
> But sometimes one tool could have been better without giving up any of what made it useful for its niche.
Famous last words of a standards implementer. JSON wasn't meant to be this broad; it reached broad acceptance largely because for most cases it is sufficient and simplifies messaging and data exchange. There are plenty of other standards if you need more complexity, or you can build your own. You use JSON and like it because it is simple.
The hardest thing for an engineer/developer is simplifying complex things. JSON is a superstar in that respect, and I'd like to thank Crockford for holding back on demands like yours. Not because your reasons don't hold value, they do, but because they move beyond simplicity, and soon JSON would be the past because it had been XML'd.
In my opinion JSON is one of the best simplifications ever invented in programming and led to tons of innovation as well as simplification of the systems that use it.
If people make JSON more complex we need SON, Simple Object Notation, locked to Crockford's JSON, and any dev who wants to add complexity to it will forever be put in the bikeshedding shed and live a life of yak shaving.
JSON is a data and messaging format meant to simplify. If you can't serialize/deserialize to/from JSON, your format might be too complex; and if something doesn't fit exactly in JSON, just put the value in a key and add a 'type' or 'meta' key that lets you translate to and from it. If it's binary, store it as base64; if it's a massive number, put it in a string with a type next to it to convert to and from. JSON is merely the messenger, don't shoot it. JSON is so simple it can roll from front-end to back-end, where parsing XML or binary is more of a pain in some areas, especially for third-party consumers.
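A sketch of that kind of envelope for binary data, in Node-flavored JavaScript (the "type"/"value" keys are again just one possible convention, not a standard):

    // Wrap raw bytes as base64 plus a type hint; unwrap on the other side.
    const bytes = Buffer.from([0xde, 0xad, 0xbe, 0xef]);
    const msg = JSON.stringify({ type: "bytes", value: bytes.toString("base64") });
    // -> {"type":"bytes","value":"3q2+7w=="}
    const back = Buffer.from(JSON.parse(msg).value, "base64"); // <Buffer de ad be ef>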
JSON being simple actually simplifies the systems built with it, which is a good thing for engineers who like to take complexity and make it simple rather than turn simplicity into complexity like a vogon.