On quality and standards: The W3C

Over on his blog, Arnaud Le Hors has an excellent piece called “A Standards Quality Case Study: W3C.”

With all the recent fuss about ISO, I think people need to remember that important, and sometimes more important, standards work is taking place in groups like the W3C, OASIS, the IEEE, the OMG, and the OAGI. That is, in my opinion, the ultimate stamp of quality and acceptance need not come from ISO or one of the other I** organizations. It can be, but today that is evidently not automatic.

I know that procurement laws may require international standards in some cases. Maybe it’s time to revisit those laws and instead have them relate to quality, openness, and transparency rather than historical working arrangements.


7 Responses to On quality and standards: The W3C

  1. Alex Brown says:

    Hi Bob

    I don’t think anybody (who knew of them in the first place) will have forgotten standards bodies like W3C or OASIS. Indeed, for those of us who work with XML, the W3C is of course the central source of most of the key specifications.

    Surely, though, quality is not an automatic facet of any particular body’s work, but varies according to many factors: the people, the time, the politics, etc. So while W3C has given us some great technologies (XML 1.0, XSLT, MathML, and SVG – to name but four), it has also given us some stinkers (e.g. XML Schema, the whole WS-* stack, and XML 1.1).

    I think it’s interesting that often the blame for “stinkiness” can be traced squarely back to vendor influence. To take one tiny example: why did the W3C decide to count the (previously forbidden) NEL character as a line feed in XML 1.1, other than for reasons of compatibility with legacy IBM systems, which (practically alone among their competitors) made use of this character? This was one of the disastrous moves that made such XML 1.1 instances incompatible with the entire installed base of XML 1.0 processors.
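
    To make that divergence concrete, here is a rough sketch of the two end-of-line handling rules (my own illustration based on section 2.11 of each spec, not code taken from either document):

        import re

        # XML 1.0, section 2.11: CRLF and any lone CR are normalized to a
        # single LF; NEL (U+0085) passes through as ordinary content.
        def normalize_eol_xml10(text):
            return re.sub('\r\n?', '\n', text)

        # XML 1.1, section 2.11: CRLF, CR+NEL, lone CR, NEL (U+0085) and
        # LS (U+2028) are all normalized to a single LF.
        def normalize_eol_xml11(text):
            return re.sub('\r\n|\r\u0085|\u0085|\u2028|\r', '\n', text)

        content = 'first line\u0085second line'   # the same characters in both cases
        print(repr(normalize_eol_xml10(content)))  # 'first line\x85second line'
        print(repr(normalize_eol_xml11(content)))  # 'first line\nsecond line'

    So a 1.0 processor hands the application a NEL in the middle of the text while a 1.1 processor hands it a line feed, and anything whitespace-sensitive downstream sees two different documents.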

    XML 1.0 (which I think of as a clean, well-written spec) has attracted over 200 errata in its lifetime. At around 40 pages, that’s 5 errors per page. Do you think certain recent high-profile ISO/IEC standards are significantly more faulty than that?

    When you mention procurement, I take it you mean procurement by nations. The major factor here is surely that nations lean towards international standards because they are international, not necessarily because of perceptions of superior quality. Being international means that they (the nations) ultimately can control the standardisation process. Vendor-driven consortiums perform a different function and are valued less accordingly: it’s not technical, it’s political.

    And if laws are to be re-visited and standards bodies judged, who is going to be doing the re-visiting and the judging? Ultimately it is a precept of international standardisation that the sovereign nations order their own affairs and yes – sometimes this means vendors get upset. Ultimately we (the users) need the nations as they are the only entities powerful enough to bring today’s huge corporations to heel.

    - Alex.

  2. Bob Sutor says:

    @Alex: I can hardly agree with you that the WS-* standards are stinkers. Moreover, a “certain recent high-profile ISO/IEC standard” has literally become the paradigm for low technical quality, extra-process political maneuvering, and abuse of standards intent. Oh, how very much of which to be proud.

    As to bringing vendors to heel, nice try, but as you know many of the participants in national standards bodies are vendors, or their partners, or their allied consultants. And when the work of those people, many of whom are trying to be independent and unbiased, is overruled at the last minute by people previously unconnected with the process, we don’t have sovereignty, we have downright confusion. We indeed need some real improvements in some national body efforts to bring transparency, openness, and less ambiguity of process to bear.

    Comparing XML 1.0 to OOXML is just downright insulting to the creators of the former standard, in my opinion, no matter how much supporters of OOXML may now want to justify their actions.

  3. Chris Ward says:

    In a previous era, we communicated with SNA. Nowadays, we communicate with TCP.

    Why the change? Is it because ‘the world’ decided to quit using a standard defined by a single vendor, and instead to use a standard defined by agreement amongst engineers from multiple vendors and users?

    SNA still works, and works very well. But hardly any new projects exploit it. Can you find a public Internet Service Provider who offers SNA? I think not.

    I’m sure it would be more lucrative for IBM if everyone had to use SNA. But it’s not going to happen. Open standards displace proprietary ones.

  4. Arnaud Le Hors says:

    Actually, the change in the handling of the NEL character in XML 1.1 is the wrong thing to blame if you have any grudge against XML 1.1. XML 1.1 is all about extending the set of allowed characters, so this particular one made no real difference in that regard. If anything, the main problem with XML 1.1 comes from the incompatibility with 1.0 that was created by forbidding in 1.1 characters that were allowed in 1.0.
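
    To put a concrete shape on that asymmetry, here is a small sketch (my own illustration of the Char and RestrictedChar productions in the two specs, not code from either) of which code points may appear literally in content:

        # True if the code point matches XML 1.0's Char production.
        def literal_ok_xml10(cp):
            return (cp in (0x9, 0xA, 0xD) or 0x20 <= cp <= 0xD7FF
                    or 0xE000 <= cp <= 0xFFFD or 0x10000 <= cp <= 0x10FFFF)

        # True if the code point is an XML 1.1 Char and not a RestrictedChar
        # (RestrictedChars may appear only as character references).
        def literal_ok_xml11(cp):
            is_char = (0x1 <= cp <= 0xD7FF or 0xE000 <= cp <= 0xFFFD
                       or 0x10000 <= cp <= 0x10FFFF)
            restricted = (0x1 <= cp <= 0x8 or 0xB <= cp <= 0xC
                          or 0xE <= cp <= 0x1F or 0x7F <= cp <= 0x84
                          or 0x86 <= cp <= 0x9F)
            return is_char and not restricted

        for cp in (0x01, 0x85, 0x9F):
            print(f"U+{cp:04X}  1.0: {literal_ok_xml10(cp)}  1.1: {literal_ok_xml11(cp)}")
        # U+0001  1.0: False  1.1: False  (new in 1.1, but only as a character reference)
        # U+0085  1.0: True   1.1: True   (though a 1.1 parser then turns it into a line feed)
        # U+009F  1.0: True   1.1: False  (legal literal in 1.0, a well-formedness error in 1.1)

    U+0001 shows the extension, and U+009F shows the backward break: a character that every 1.0 parser accepts as a literal and no 1.1 parser may accept unescaped.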

  5. Darren Bell says:

    Well, myself and a couple of co-workers have been following the story around OOXML and standards in general, and we have all now come to the same conclusion: that the ISO no longer stands for decent standards.

    The company I work for participates in the international standards community in the audio arena, and I can say that we no longer hold the ISO in such high regard as we used to. I’m afraid that the damage has been done and it will take a very long time to repair it.

    As mentioned by Bob, the W3C, OASIS, etc. will be the focus now. ISO just made itself nearly irrelevant.

    Just a question, though. For OOXML to be adopted by governments, do there have to be at least one or two products fully implementing the standard? How long before we can deprecate the standard through non-implementation?

  6. Alex Brown says:

    Arnaud hi

    The introduction of NEL is not the only problem with XML 1.1, you’re right. But it is most certainly *a* problem in that the whitespace-processing of an XML document will differ according to its version. An excellent analysis of this (and other XML 1.1 issues) can be found in a free excerpt from Elliotte Rusty Harold’s (must read) book _Effective XML_ at http://www.cafeconleche.org/books/effectivexml/chapters/03.html

    I’m not sure it’s healthy to have a “grudge” against a technical specification (though viewing specifications, or presenting them, in emotional terms seems to have been fashionable lately). The reason for mentioning XML 1.1 is not emotional; it is a useful corrective (at a micro level) to any idea that the output of the W3C always represents vendor-agnostic standards perfection. XML Schema exemplifies the same thing at a macro level. In my experience the output of all standards bodies (international or not) is very varied, in technical quality and in other ways … basing everything around perceptions about the 29500 project is really just single-issue politics, isn’t it?

    - Alex.

  7. Bob Sutor says:

    @Alex: Not single issue at all, nor just perceptions. People know a lot about what really went on and by whom, which may surprise you. While it has problems itself, it illustrates how much needs to get fixed in the “system” from top to bottom.
