This parse/validate activity seems natural in XML, and it makes me wonder why the same is not the case with JSON.
Remember what made JSON famous: web applications with very rich user interfaces written in JavaScript. JSON was perfect for that because data from the server mapped directly onto JavaScript objects; hit a GET URL with Ajax and you get back an object, with no need for parsing or anything else.
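To make that concrete, here is a minimal TypeScript sketch of the pattern (the /api/user/42 endpoint and the User shape are made up for illustration): the response body is already an object graph, and its fields map straight onto properties with no separate parsing step.

```typescript
// A made-up User shape; in a real application this mirrors whatever the server sends.
interface User {
  id: number;
  name: string;
}

async function loadUser(): Promise<User> {
  // Hypothetical GET URL; fetch() stands in for the classic Ajax call.
  const response = await fetch("/api/user/42");
  // The JSON body deserializes directly into an object: no DOM, no XPath, no mapping layer.
  const user: User = await response.json();
  console.log(user.name);
  return user;
}
```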
Those rich interfaces made JavaScript a very popular language, and JSON obviously rode along; its popularity made it the best candidate to overthrow “the angle bracket”.
But XML has a long history behind it. It’s a mature technology with lots of accompanying specifications. JSON is just catching up to those.
Although the draft specification expired in May 2011, the JSON Schema supporters think they have reached a version of the spec that is pretty close to final. So who knows what the future holds for JSON Schema.
I was surprised to find only one PHP implementation […] Is the IETF draft for JSON Schema serious work, given that only a few implementations exist on the server side (PHP)?
Does this PHP implementation validate JSON as per the latest version of the JSON Schema draft? If yes, is there a need for other implementations? Do you need lots of implementations to certify that a specification is serious? That’s like saying that XSLT 2.0 is not serious because Microsoft didn’t bother to implement it.
As for your last question, incoming data needs to be validated. You don’t take a user request, throw it at the server and “hope it sticks”. You validate it; and JSON Schema is not the only way to validate JSON data, so don’t assume that web developers using JSON do not thoroughly check their incoming request data.
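To illustrate, here is one way a server-side handler might check an incoming request body against a JSON Schema before acting on it. The schema, the field names and the use of the Ajv library are my own assumptions for this sketch, not part of the original discussion; any JSON Schema validator would serve the same purpose.

```typescript
import Ajv from "ajv"; // Ajv is just one of several JSON Schema validators (assumed here)

// A small, made-up schema for a "create user" request body.
const userSchema = {
  type: "object",
  properties: {
    name: { type: "string" },
    age: { type: "integer", minimum: 0 },
  },
  required: ["name"],
  additionalProperties: false,
};

const ajv = new Ajv();
const validateUser = ajv.compile(userSchema);

// Validate the body before the server does anything with it,
// instead of throwing it at the application and hoping it sticks.
function handleCreateUser(body: unknown): string {
  if (!validateUser(body)) {
    return "400 Bad Request: " + ajv.errorsText(validateUser.errors);
  }
  return "201 Created";
}
```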
In conclusion, what I’m trying to say is that JSON can’t fully replace XML, nor the other way around. So use each technology where it is appropriate.