Some metrics for measuring the success of the OOXML BRM

A very strange thing happened at the beginning of September, 2007. Microsoft issued a press release following the close of balloting on OOXML that began

Today the International Organization for Standardization (ISO) released the results of the preliminary ballot to participating National Body members for the ISO/IEC DIS 29500 (Ecma 376 Office Open XML file formats) ratification process. The results show that 51 ISO members, representing 74 percent of all qualified votes, stated their support for ratification of Open XML. Along with their votes, the National Bodies also provided invaluable technical comments designed to improve the specification. Many of the remaining ISO members stated that they will support Open XML after their comments are addressed during the final phase of the process, which is expected to close in March 2008.

This was talking about a very successful occurrence, evidently, and there was reason for great joy around the world.

In their recent press release about the OOXML Ballot Resolution Meeting that is taking place in Geneva this week, the ISO spelled out what had happened a bit differently:

Approximately 3 500 comments were received during last year’s ballot. By grouping and by eliminating redundancies, these have been edited by SC 34 experts down to 1 100 comments for processing during the five days of the BRM.

The five-month ballot process which ended on 2 September was open to the IEC and ISO national member bodies from 104 countries, including 41 that are participating members of the joint ISO/IEC technical committee, JTC 1, Information technology.

Approval requires at least 2/3 (i.e. 66.66 %) of the votes cast by national bodies participating in ISO/IEC JTC 1 to be positive; and no more than 1/4 (i.e. 25 %) of the total number of national body votes cast to be negative. Neither of these criteria were achieved in the DIS vote, with 53 % of votes cast by national bodies participating in ISO/IEC JTC 1 being positive and 26 % of national votes cast being negative.

What? They had to achieve passing marks on two different criteria, yet both failed?
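For the record, the two-part rule quoted in the ISO statement can be written out as a quick check. This is just a sketch of the arithmetic as quoted (the function name is mine, and it takes the percentages the press release reports rather than raw vote counts):

```python
def dis_approved(pct_p_member_positive: float, pct_total_negative: float) -> bool:
    """Apply the two JTC 1 approval criteria as quoted:
    at least 2/3 of P-member votes positive, AND
    no more than 1/4 of all national body votes negative."""
    criterion_1 = pct_p_member_positive >= 200 / 3  # 66.66...%
    criterion_2 = pct_total_negative <= 25.0
    return criterion_1 and criterion_2

# The September 2007 DIS ballot figures quoted above:
# 53% of P-member votes positive, 26% of all votes negative.
print(dis_approved(53.0, 26.0))  # False: both criteria missed
```

Plugging in the quoted 53% and 26% shows that the ballot fell short on both counts, which is exactly why a BRM was needed at all.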

What should we expect in terms of statements by ECMA and Microsoft about the result of the BRM? Your guess is as good as mine; it might be fun to write down what you expect now and compare it with whatever is eventually published, if a press release appears at all.

I think it would be useful to have some real quantitative metrics for measuring the success of the BRM. These might be especially worth considering if your country sent a delegation to Geneva. Some of them use the same data, but in different ways.

To be clear, I fully believe that the convener and the national standards body delegations gave their best efforts to make the BRM a success. I was not in the meeting, and these metrics may be especially useful to others who also did not attend.

In the following, I will use the phrase “fully discussed with consensus reached” to mean that when a comment suggested a problem with the OOXML specification, a discussion of all proposed resolutions took place in a full and balanced manner, and then consensus developed around whether a particular resolution should be adopted to modify the specification or leave it as-is.

  1. How many total comments were there and how many were fully discussed with consensus reached at the meeting?
  2. What percentage of the total comments were fully discussed with consensus reached this week? Was it closest to 10%, 20%, 30%, 40%, 50%, 60%, 70%, 80%, 90%, or 100%?
  3. Conversely, what percentage of the total comments were voted on without a full discussion and consensus reached? Was it closest to 10%, 20%, 30%, 40%, 50%, 60%, 70%, 80%, 90%, or 100%?
  4. Individual or closely related comments from countries were considered in country alphabetical order, cycling through as many times as necessary. How many full cycles were there?
  5. How many times during the week did your country get an opportunity to begin a full discussion on a new comment?
  6. What was the percentage of all your comments that were fully discussed with consensus reached during the week?
  7. Of the comments that were fully discussed with consensus reached, how many of these were simple editorial corrections? What percentage of all the comments that were fully discussed with consensus reached were these simple editorial corrections?
  8. How many issues were taken offline for discussion?
  9. How many of the offline issues were subsequently brought back before the BRM for a full discussion and subsequently reached consensus?
  10. In Euros and in your local currency, what is the total cost for your delegation to come and work in Geneva for the week?
  11. In Euros and in your local currency, what is the total cost divided by the number of opportunities you had to begin full discussion on a new comment (that is, cost per national body comment discussion opportunity)?
  12. How much time was spent discussing procedural issues and what percentage of the total BRM time was this?
  13. How much time was spent discussing how to vote on unaddressed comments and what percentage of the total BRM time was this?
  14. How many Microsoft Office-specific features remain in the OOXML specification?
  15. After this experience, how many comments on a proposed JTC1 Fast Track standard would you say is too many to fully discuss and reach consensus on in a five day BRM?
  16. How long would the BRM have needed to be to fully discuss and reach consensus on all comments?

If you have attended other BRMs, how would you compare the OOXML BRM metrics with similar metrics for the other BRMs?

I’ll leave it to you to interpret the importance of any of these metrics after we get some numbers to plug into them. Presumably we will get a lot of the numbers to use because this is supposed to be an open and transparent process. At that point you can judge the relative success or failure of the BRM in its mission to review and reach consensus on resolutions to the comments offered by the national standards bodies.

What metrics would you use?

Incidentally, regarding the apparent discrepancy in interpretation of the September ballot numbers mentioned at the beginning of this entry, see the eWeek Microsoft Watch article by Joe Wilcox called “Microsoft FUD Watch: OOXML Edition.” It’s worth a read.

Also see: ‘An “OOXML is a bad idea” blog entry compendium’


This entry was posted in Document Formats, Standards. Bookmark the permalink.

9 Responses to Some metrics for measuring the success of the OOXML BRM

  1. Felix says:

    I suggest you clarify what you mean by consensus in the above metrics. (i.e. Does the proposal have to be accepted in order for ‘consensus’ to have been reached?)

  2. Bob Sutor says:

    Felix: Thank you, I tried to reword it to make that clearer.

  3. More importantly, as a question to be asked, why was there no public access to a process involving a decision that affects us all? We should take pride in the process, should we not?

  4. Stephen says:

    How about # of IBM employees not in a BRM delegation present in Geneva for the whole week Bob?
    Have a safe trip home :-)

  5. Bob Sutor says:

    Stephen: Nice to see you again as well! :-)
    Good travels to you and all your colleagues on your way home.

  6. @Stephen, there are 313 IBM employees based in the Geneva offices. Of course, most of them couldn’t have cared less about the BRM. There were lots of Microsoft employees in Geneva over the week too, since you have an office there.

    Really, what a ridiculous question.

  7. Alan Bell says:

    @Stephen
    or how about how many Microsoft employees such as yourself, Nick Tsilas, the Microsoft Malaysia lackey etc etc were cruising around Geneva with no particular place to go? Or how about we exclude the IBMers and Microsofties and look at the rest of the collection of hangers-on (Yes I am in that number). So how many non-Microsoft folks who were not in a delegation were taking a pro-OOXML position? Any? There were plenty of non-IBM folks arguing for the demise of the substandard standard.

  8. Stephen says:

    @Alan & Nathan, with respect, you guys might work on your sense of humour. It was a tongue in cheek comment (as was Bob’s response).

    As you ask though, I was in Geneva on Tuesday at the anti-BRM as OFE’s guest, and for a lunch conversation on Thursday afternoon. That was it. I wasn’t in Geneva over the weekend, or on Monday, or on Friday.

    Nick also attended the anti-BRM event on Tuesday and Wednesday. I think we were the only MS attendees.

    Your use of the word lackey (1. a servant. 2 a servile or obsequious person.) is a bit unnecessary, and I’m not sure who you mean I’m afraid.

    “There were plenty of non-IBM folks arguing for the demise of the substandard standard”. Isn’t that exactly the point? OFE organised an event specifically to target the BRM. There was no secret made of that; in fact Graham Taylor boasted about it. Plus the anti-OpenXML crowd apparently hung around after the OFE event closed for secret meetings.

    I was amused when, on Thursday, there was a huge ECIS meeting. It turned out to be this ECIS, though, and not the EU commissioner.

  9. Bob Sutor says:

    Ok, we’ll end this thread here, I think we’ve heard both sides.

    That said, characterizing the OFE event as anti-BRM was a misrepresentation, Stephen, since many other folks from the Geneva area attended. Minimally, it cut down on the food bill for the BRM delegates in what was otherwise an extraordinarily expensive week.

Comments are closed.