I don't believe in the spec at all, because as pointed out in the article, the spec is not precise enough. The spec also puts undue burden on the parser.
The spec should absolutely say whether the first or the last key takes precedence when duplicate keys appear. I don't think the spec should simply demand that the parser error out on duplicate keys, as that isn't helpful: even if such an error is raised, there should still be a way to parse the structure and get at the data. A duplicate key shouldn't cause a complete parse failure; it is recoverable.
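For illustration, here is roughly how the last-key-wins convention shows up with Go's encoding/json, which accepts duplicates without complaint and keeps the later value when unmarshaling into a map (purely a library convention, not anything the spec promises):

package main

import (
    "encoding/json"
    "fmt"
)

func main() {
    // Duplicate "a": encoding/json does not error; the later value
    // silently overwrites the earlier one in the resulting map.
    raw := []byte(`{"a": 1, "a": 2}`)

    var m map[string]int
    if err := json.Unmarshal(raw, &m); err != nil {
        panic(err)
    }
    fmt.Println(m["a"]) // prints 2 -- last key wins, by convention only
}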
To resolve the problem of people placing comments into dummy values (or duplicate keys), the JSON spec really needs a feature-set / flag key. Something like:
{
"_flags":["comments"],
...
}
This would allow extensions to be reasonably supported, aside from the fact that, initially, none of the parsers would do anything with this extra key.
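A rough sketch of what a consumer honoring such a flag key could look like, in Go (the two-pass approach and the parseWithFlags name are just my illustration; no existing parser does this):

package main

import (
    "encoding/json"
    "fmt"
)

// parseWithFlags is a hypothetical two-pass consumer: first pull out the
// "_flags" array, then hand back the remaining members so the caller can
// decide how to treat them.
func parseWithFlags(raw []byte) (map[string]json.RawMessage, []string, error) {
    var doc map[string]json.RawMessage
    if err := json.Unmarshal(raw, &doc); err != nil {
        return nil, nil, err
    }

    var flags []string
    if f, ok := doc["_flags"]; ok {
        if err := json.Unmarshal(f, &flags); err != nil {
            return nil, nil, err
        }
        delete(doc, "_flags") // flags are metadata, not payload
    }
    return doc, flags, nil
}

func main() {
    doc, flags, err := parseWithFlags([]byte(`{"_flags":["comments"],"key":"value"}`))
    if err != nil {
        panic(err)
    }
    fmt.Println(flags, string(doc["key"])) // [comments] "value"
}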
Another flag would be "plainstrings", which would tell the parser NOT to parse escapes in any way and just deliver the string data as is. This is what my parser does, as there are too many pitfalls in parsing escapes consistently, and you then also have to code how to write those escapes back out. This may be passing the buck onwards, but I view it as division of responsibilities. Why should the parser be responsible for, and required to understand, Unicode and all its complexities?
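A minimal sketch of the idea, assuming you just want the raw bytes between the quotes handed back with backslash escapes left untouched (not my actual parser, just the shape of it):

package main

import "fmt"

// rawString scans a JSON string literal starting at data[i] (which must be '"')
// and returns the raw bytes between the quotes without decoding any escapes;
// sequences like \n or \u00e9 are passed through exactly as written.
func rawString(data []byte, i int) (raw []byte, next int, err error) {
    if i >= len(data) || data[i] != '"' {
        return nil, i, fmt.Errorf("expected '\"' at offset %d", i)
    }
    start := i + 1
    for j := start; j < len(data); j++ {
        switch data[j] {
        case '\\':
            j++ // skip the escaped character so \" does not terminate the string
        case '"':
            return data[start:j], j + 1, nil
        }
    }
    return nil, i, fmt.Errorf("unterminated string starting at offset %d", i)
}

func main() {
    raw, _, _ := rawString([]byte(`"line1\nline2"`), 0)
    fmt.Printf("%s\n", raw) // prints the literal characters: line1\nline2
}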
Another flag would be "type.[some type]" to indicate the presence of a type and the need to be able to parse it. The way I implemented this in my parser is:
{
key: [type name].[type representation]
}
For example, hex data:
{
key: x.4FE310B2
}
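A rough sketch of consuming such a tagged value once you have it in hand as a string; the "x" tag and the decodeTyped helper are just illustrative names, not part of any existing parser:

package main

import (
    "encoding/hex"
    "fmt"
    "strings"
)

// decodeTyped splits a "[type name].[type representation]" value and decodes
// the one type tag this sketch understands: "x" for hex-encoded bytes.
func decodeTyped(v string) (interface{}, error) {
    tag, rep, ok := strings.Cut(v, ".")
    if !ok {
        return v, nil // no tag: treat it as a plain string
    }
    switch tag {
    case "x":
        return hex.DecodeString(rep)
    default:
        return nil, fmt.Errorf("unknown type tag %q", tag)
    }
}

func main() {
    val, err := decodeTyped("x.4FE310B2")
    if err != nil {
        panic(err)
    }
    fmt.Printf("% X\n", val) // 4F E3 10 B2
}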
These sorts of things would address the issues raised by the article.
Who am I kidding, though... Cooperation doesn't exist in the community, no one will change their parsers to do as I suggest, and I'm likely to get downvoted just for sharing my ideas. Carry on, uncaring world.
Out of curiosity, how do you manage to get the first key instead of the last? It seems to me it would take less logic, and be easier for a parser, to always return the last.
First-key behavior comes up with a third-party performance library for Go. If you are parsing a stream, there are always situations where you can stop reading as soon as you get the first answer, but then you don't really know whether the stream is valid JSON at all.
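Roughly what that pattern looks like with plain encoding/json token streaming (not the third-party library itself, just the stop-at-first-match idea):

package main

import (
    "encoding/json"
    "fmt"
    "io"
    "strings"
)

// firstValue streams tokens and returns the value of the first occurrence of
// key at the top level of an object, without reading the rest of the input --
// which also means later duplicates (or later syntax errors) are never seen.
func firstValue(r io.Reader, key string) (interface{}, error) {
    dec := json.NewDecoder(r)

    if _, err := dec.Token(); err != nil { // consume the opening '{'
        return nil, err
    }
    for dec.More() {
        tok, err := dec.Token() // the member key
        if err != nil {
            return nil, err
        }
        var v interface{}
        if err := dec.Decode(&v); err != nil { // the member value
            return nil, err
        }
        if tok == key {
            return v, nil // stop here; the tail of the stream is never validated
        }
    }
    return nil, fmt.Errorf("key %q not found", key)
}

func main() {
    v, err := firstValue(strings.NewReader(`{"a": 1, "a": 2, "b": }`), "a")
    fmt.Println(v, err) // 1 <nil> -- the broken "b" member is never reached
}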