Running Code?

  • General discussion


    Is “Running Code” necessary in the standardization process?  Empirically, the answer is yes.  In the IETF, a standards-track RFC cannot achieve Draft Standard status without demonstration of at least two interoperable implementations developed from independent code bases.  For digital media codec standards, reference decoders, encoders, and bit streams are generally a required component of the standardization process.

     

    In other cases, such as USB, PCI, and Wi-Fi, the standards were developed from near production-ready technologies, but not necessarily from independent, interoperable implementations.  Cases of standards developed, or being developed, from commercially available products include ISO/IEC DIS 32000 (Adobe PDF 1.7), ISO/IEC 26300 (OpenDocument Format), ECMA-376 (Office Open XML), JPEG XR (HD Photo), and XPS.  In these cases the Running Code consisted of pre-production silicon, many PDF implementations, OpenOffice, Microsoft Office, and Microsoft Windows.

     

    Personally, I can’t recall an industry standard that didn’t have some level of implementation behind it at the beginning.  Even in the case of standards, like JPEG, that are purported to have started with a clean sheet of paper, I’ll bet the technical contributions had at least been developed and implemented at the research level.

     

    Perhaps the real question is whether standards organizations should rely on the IETF definition of Running Code or whether other forms of demonstrable or theoretical interoperability are acceptable.

     

    What do you think?

     

    John Calhoon

    Microsoft

    • Changed type by Chris Mullaney, Tuesday, October 14, 2008 10:34 PM: “this is a discussion, not a question.”
    Wednesday, March 26, 2008 8:48 PM

All replies

  •  John Calhoon wrote:

     

    Is “Running Code” necessary in the standardization process? 

    [ ... ]

    Perhaps the real question is whether standards organizations should rely on the IETF definition of Running Code or whether other forms of demonstrable or theoretical interoperability are acceptable.

     

    I'm going to chew on this a bit.  My basic view is that working code is really important for software-dependent standards.  A reference implementation would also be good.  But I don't think that there is a pat answer.

     

    Now for the longer version.

     

     - Dennis

     


     

    The longer version:

     

    We need to remember that there are standards that are not about computation, although computing might be involved.  I'm thinking of the ISO 9000 series, the IEEE (and ISO/IEC) specifications on software engineering, and so on.  The lower-level Ethernet layers and signaling standards required interoperability, I'm sure.

     

    My experience with programming-language standards is that there were usually implementations at the beginning: either convergence on a standard specification was wanted (e.g., Fortran), or a new solution was brought into existence to resolve a confusion of disparate initiatives (the actual origin of Cobol).  ASCII didn't have an implementation, although I suspect the teletypewriter folk were ready and itching to go; I came in later than that.

     

    Something different happens in standards maintenance, though, where revisions of the standard may lead the percolation of implementations into use.

     

    I personally favor the IETF approach, and it makes sense because the IETF's attention is aimed at interoperability of protocols and the formats that go with them.

     

    However, other standards-promulgating authorities seem to have their own ideas. 

     

    For example, at the IETF there has to be demonstrated interoperability just to advance from Proposed Standard to Draft Standard.  And becoming a Proposed Standard takes a great deal of public work and demonstration of consensus.  I don't know if anyone has to have built it by then, but one would hope for some eager beavers to be at work.  To eventually advance to full IETF Standard often takes years, and is really recognition that the specified protocol or other element is a standard in practice and usage, in everyday reality.

     

    At the other extreme, OASIS calls things standard when the specification first becomes official and available to adopt.  The W3C just maintains specifications and doesn't attempt to define "standard," as I recall.  I don't know that anyone had to demonstrate a processor for XML before XML 1.0.  I know the proposal for SOAP was implemented and the agreed W3C specification was not implemented at first.  (I am being lazy about looking up the history there, so don't shoot me.)

     

    I do note that neither ODF nor OOXML would have passed the IETF test at this point.  The implementations have to be complete and independently achieved -- on different code bases, in particular -- and features that aren't interoperably implemented have a hard time surviving to the next stage.  I know back in the days of OSI development there were specifications made up by 2-3 people with no implementation in sight.  I would hope those days are behind us.

    I just noticed something very strange, speaking of standards and interoperability.  I dove into HtmlView to select smaller text and also to draw the horizontal line.  What I noticed is that the HTML tags are spelled in CAPITAL LETTERS.  I did some view-source on posts here and that doesn't seem to be how things end up, but it threw me.  (The actual posts are in some sort of pseudo-HTML with no DOCTYPE.)  I only mention this because there is then the problem of staying interoperable in practice, and the MSDN Forum implementation definitely keeps me guessing.
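    As an aside on why uppercase tags "work" anyway: classic HTML tag names are case-insensitive, while XML is case-sensitive, so the same habit that is harmless in HTML breaks an XML processor.  A small sketch using Python's standard-library parsers illustrates the difference (the sample markup is made up for illustration):

    ```python
    from html.parser import HTMLParser
    import xml.etree.ElementTree as ET

    # HTML tag names are case-insensitive: Python's html.parser
    # normalizes <P> and <B> to the same lowercase names as <p> and <b>.
    class TagCollector(HTMLParser):
        def __init__(self):
            super().__init__()
            self.tags = []

        def handle_starttag(self, tag, attrs):
            self.tags.append(tag)

    collector = TagCollector()
    collector.feed("<P><B>shouted markup</B></P>")
    print(collector.tags)  # ['p', 'b']: the uppercase tags were normalized

    # XML, by contrast, is case-sensitive: <P> and </p> do not match,
    # so the same sloppiness is a fatal parse error.
    try:
        ET.fromstring("<P>mismatched</p>")
    except ET.ParseError as exc:
        print("XML rejects it:", exc)
    ```

    That asymmetry is a small, everyday instance of the interoperability problem: two specifications that look alike on the page disagree on a detail, and only running code surfaces the disagreement.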
    Friday, April 11, 2008 11:19 PM
  •  orcmid wrote:
     John Calhoon wrote:

     

    Is “Running Code” necessary in the standardization process? 

    [ ... ]

    Perhaps the real question is whether standards organizations should rely on the IETF definition of Running Code or whether other forms of demonstrable or theoretical interoperability are acceptable.

     

    I'm going to chew on this a bit.  My basic view is that working code is really important for software-dependent standards.  A reference implementation would also be good.  But I don't think that there is a pat answer.

     

    I want to back up a little and consider the name of this MSDN Forum: Achieving Interoperability through Standards.

     

    I think that standards for protocols and formats are essential to achievement of interoperability.  The different players need a common basis for agreement on how things are supposed to work.  It is the point of reference for demonstrating and debugging attempts at interoperability.  Those efforts will also debug the specification of the standard.  It is a bit iterative, and a bit chicken-and-egg.

     

    Whether a standards-development process should require confirmed, interoperable implementations or not depends on circumstances.  I think it should be done more than it is, especially once a standard is undergoing maintenance and extensions.   But that should be worked out in the chartering of the standards-development process, it seems to me.

    Saturday, April 12, 2008 12:16 AM
  • John, I woke up in the middle of the night realizing that my post yesterday is a supporting argument for running code.  It is not strictly about the standardization process, but about protocol specifications.  I can certainly see how formal standardization of a protocol would want to require demonstration of independent, running implementations, though.
    http://orcmid.com/blog
    Thursday, July 10, 2008 5:15 PM