Interoperability: Do We Need To See Code?

    An article by Dino Chiesa has led me to look more deeply at the ways that development of open protocols might resemble open-source activities, even though open protocols for interoperability ideally do not require knowledge of anyone's code.  It is more valuable, I say, to look at how the specifications and the verification of implementations are reconciled over time.  It isn't always pretty, but there may be a place where the many-eyes idea about improvement of open-source software applies to open specifications.

      1. Interoperability and Open-Source Development Are Different 
      2. The Advantage of Not Seeing the Code 
      3. Open Protocols and Community Engagement 
      4. What Was the Question, Again?

    1. Interoperability and Open-Source Development Are Different

    Dino Chiesa has a provocative blog post about Open Source and Interoperability (via Enzo De Lorenzi), arguing for a separation of open source distribution (a development and licensing approach) and interoperability (the ability to connect and operate systems and components together to accomplish some purpose):

    "I am not making quality judgments on either open source or interop.  I am not saying that one of them is good and one is bad.  I am saying they are two different things.  ... They are related, they are neighbors, they are acquaintances, but they are not interchangeable.  Nor does one imply the other.  

    "Repeat after me:

    1. Interop is not open source.
    2. Interop does not require open source implementations.
    3. Open source does not guarantee Interop."

    I'm already convinced of that.  What I am more taken with in Chiesa's post is his analysis of what is important for interoperability.  I also think that practices associated with open-source can have a role there.  Let's take a look.

    2. The Advantage of Not Seeing the Code

    First, Chiesa concedes that re-use of code from different sources may be aided by being able to see that code.   That is a minor interoperability situation, however useful it may be for understanding and repurposing others' code.

    There are greater benefits in not having the code to look at.  Chiesa puts it this way:

    "[For the] challenge that architects and devs confront when they use the word 'interop', looking at  source code is not helpful, and I won't hesitate to argue, I think it would be counterproductive.   Yeah, you read me right - it actually is harmful to look at the code if you want to connect two big apps together.

    "What is necessary to enable interop in these cases is protocols, people.  ... This is why a Java or .NET app can connect to an IBM transaction processing system, even though the on-the-wire protocols are completely closed and proprietary to IBM.  The protocols are documented.    They are closed yet published.  And because IBM's DTP protocols are published (not publicly per se, but published to those who license the protocols), anyone can implement the client-side of the exchange. [all emphasis in the original]"

    Protocol specifications describe the protocol data elements, interfaces and the essential behavior of the parties, independent of exactly how a particular software implementation accomplishes its rôle under the protocol.   The liability of starting from code is that behavior is buried in accidental and inessential details that obscure recognition of precisely and solely what must happen for the parties to interoperate.  
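    As a concrete illustration, here is a minimal sketch of what that specification-first view looks like in code.  The framing rule (a 4-byte big-endian length prefix followed by a UTF-8 payload) is a hypothetical spec excerpt of my own invention, not any particular protocol; the point is that it pins down only the on-the-wire behavior:

```python
import struct

# Hypothetical spec excerpt: a message is a 4-byte big-endian length
# followed by that many bytes of UTF-8 payload.  Nothing below depends
# on how any particular sender or receiver is built internally.

def encode_message(text: str) -> bytes:
    """Produce the on-the-wire form the hypothetical spec requires."""
    payload = text.encode("utf-8")
    return struct.pack(">I", len(payload)) + payload

def decode_message(frame: bytes) -> str:
    """Recover the payload, checking only what the spec mandates."""
    (length,) = struct.unpack(">I", frame[:4])
    payload = frame[4:4 + length]
    if len(payload) != length:
        raise ValueError("truncated frame")  # a spec violation, not a style choice
    return payload.decode("utf-8")

# Any two implementations that agree on this much will interoperate,
# whatever their language, platform, or internal design.
assert decode_message(encode_message("hello")) == "hello"
```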

    It is particularly easy to demonstrate the problem of working from code if the parties are unable to use the same programming-language system and platform for their implementations.  Then it is necessary to reverse-engineer the actual protocol out of the code, freeing it from incidental, implementation-specific baggage that could be a terrible drag if simulated unnecessarily in the second implementation. 

    Having the behavior and data units be well-specified provides a superior basis for interoperability.  It is also advantageous for the future maintenance and portability of the initial implementation, even when that implementation is first born as a single-platform product of a single producer.

    Since I adhere to this point of view, you can imagine my surprise on learning that Microsoft began to make source code available to licensees of its Open Specifications (those under the MCPP program at the time) in early 2006.  This was understandably expedient considering the difficulty, at that time, of deriving specifications (with whatever reverse-engineering and confirmation testing that might entail) and of developing prototype implementations by an independent technical committee.  The lesson: arrange matters in the future so that inspection of the code becomes unnecessary for knowing the protocol essentials.

    Dino Chiesa's example includes the prospective licensing of documented but proprietary protocols.  Although that is certainly one case of interoperability arrangements, I want to separate out the licensing of closed protocols and consider the degree to which there is harmony of open-source development practices and the development of open protocol specifications, with or without licensing of intellectual property.  The difference is that it involves an open approach to the specifications, independent of whether there are open-source implementation efforts.

    3. Open Protocols and Community Engagement

    More from Chiesa:

    "Standardized protocols are essential if you want broad interoperability ... .  I said that standards are not required, and I stand by that statement.  But practically speaking, standards are almost a sine-qua-non of meaningful interop.   

    "Web Services and XML are just common protocols ... .  It is not required that web service endpoints, either client or server, be implemented with an open-source web services stack, in order to get good interoperability.  Instead it is essential that the endpoints conform to the standard protocol definitions.  And the corollary is, the protocol definitions must be sufficiently clear, complete, relatively simple to implement, and relatively simple to test, such that faithful implementations of the protocols can be validated easily and will interconnect transparently. [my emphasis -- dh:]"

    Protocol specifications in which one or more parties have invested significant interoperable implementations are indeed a form of standard, even when entirely held in private arrangements.  The specifications are standards because they provide authoritative statements of what is essential to achieve and preserve to accomplish implementation interoperability.  Specifications provide the measure.  

    Specifications are also subject to versioning and consideration of ways that implementations will manage to interoperate appropriately with implementations developed to older and to newer versions of the specification.  The tension between preservation of value and expanded utility is part of the evolution of interoperability-oriented specifications.
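    One common way a specification provides for that tension, sketched here with a hypothetical "ignore unrecognized fields" rule and a made-up message shape, is to state tolerance requirements that let implementations built to an older version accept messages produced against a newer one:

```python
import json

# Hypothetical convention: every message carries a "version" field, and
# readers must ignore fields they do not recognize.  That single rule is
# what lets an implementation built to version 1 of this made-up spec
# accept messages produced against a newer version.

V1_FIELDS = {"version", "id", "body"}

def read_v1(raw: bytes) -> dict:
    msg = json.loads(raw)
    if msg.get("version", 1) < 1:
        raise ValueError("unsupported version")
    # Keep the fields this version understands; pass silently over the rest.
    return {k: v for k, v in msg.items() if k in V1_FIELDS}

# A version-2 producer adds a "priority" field the v1 reader has never seen.
v2_message = json.dumps({"version": 2, "id": 7, "body": "hi", "priority": "high"})
assert read_v1(v2_message.encode("utf-8")) == {"version": 2, "id": 7, "body": "hi"}
```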

    How do protocol specifications provide standards in this sense?

    In practice, the refinement and testing of the specification itself happens when there are multiple implementations and their interoperability is tested and confirmed.  Efforts to implement the protocol (special case: a format for interchange of information) will lead to questions of interpretation and ambiguity as the specification is studied by implementers.  Arrangements for laboratory verification of actual interoperation, perhaps with test suites, will reveal misunderstandings, bugs in implementations and bugs in the specification.
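    A toy stand-in for that kind of laboratory verification: two independently written encoders for the same hypothetical framing rule, cross-checked against each other.  Any disagreement points at a bug in one implementation, or at an ambiguity in the specification itself:

```python
import struct

# Two independently written encoders for the same hypothetical framing
# rule (a 4-byte big-endian length prefix).  Cross-testing them is a
# tiny stand-in for the interoperation checks described above.

def encode_a(payload: bytes) -> bytes:
    return struct.pack(">I", len(payload)) + payload

def encode_b(payload: bytes) -> bytes:
    # A second team's version.  Writing "little" here instead of "big"
    # is exactly the divergence that only cross-testing would expose.
    return len(payload).to_bytes(4, "big") + payload

for sample in (b"", b"x", b"interoperability" * 100):
    assert encode_a(sample) == encode_b(sample), "implementations disagree"
```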

    Once a specification is in widespread use, there are inevitably matters of imperfection:

    1. Inaccuracies and defects in the specification
    2. Deviations from the specification by different implementations
    3. Treatment of changes in the specification with regard to implementations that are already at large in the world
    4. Anticipation of how future changes in the specification will provide for implementations against the current specification
    5. Support by producers for deployed implementations that are out of specification in some way
    6. Reconciliation of existing products with the evolving specification
    7. Provision for past versions of the specification and their implementations in development of a new version

    The opportunity for open community involvement in what is held as an open protocol lies in accelerating the maturation and stabilization of the specification by attracting many eyes and interested parties.  Efforts to comprehend and apply the specification in interoperability cases are invaluable.

    Another opportunity, given that the specification is open enough for such use, is creation of a reference implementation and samples that (once themselves stabilized) do serve as a way to test implementations for essential functionality.   It is appealing to use open-source development in this case.
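    Here is a sketch of that pattern under invented names: a deliberately plain reference implementation of a made-up checksum stands as the oracle against which a more production-minded candidate is tested:

```python
import random

# A reference implementation need not be fast; it only has to be plainly
# correct.  This deliberately simple checksum (a made-up stand-in for
# whatever an open protocol actually specifies) serves as the oracle
# against which a more production-minded candidate is checked.

def checksum_reference(data: bytes) -> int:
    total = 0
    for byte in data:             # no cleverness, easy to audit
        total = (total + byte) % 65536
    return total

def checksum_candidate(data: bytes) -> int:
    return sum(data) % 65536      # the optimized version under test

random.seed(0)
for _ in range(1000):
    sample = bytes(random.randrange(256) for _ in range(random.randrange(64)))
    assert checksum_candidate(sample) == checksum_reference(sample)
```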

    4. What Was the Question, Again?

    Have we come full circle?  That depends.  An open-source reference implementation need not be product-worthy.   It may be designed to operate in a very straightforward way, with no optimizations or usability refinements.  Samples are just samples, kept simple for understandability.   And there is no need to disclose production-implementation code.   (I am also neglecting intellectual-property considerations that may limit the degrees of freedom available for unconstrained community contribution.)

    The reality is that specifications do not have value without implementations that establish their credibility and utility in achieving interoperability.  And specifications themselves are tested and refined as the result of implementation effort.  There is a cycle of learning and improvement between specifications, implementations, and experience.  The challenge is to remove friction and accelerate that process.

    Making a specification open and involving the community in perfecting it may be crucial to fostering take-up of an interoperability opportunity.  Open-source development practices are a low-friction way to invite community contribution and deliver a mutual benefit.  Having the specification be open for public use, feedback, and discussion, and safe for some kind of implementation (depending on license conditions), is a valuable way for a community to find and organize itself to collaborate on interoperability.

    This is something to consider when designing for interoperability and when assessing the level at which interoperability is invited.  Perhaps the most important consideration is that fostering and sustaining interoperability is a journey, not a destination.  It won't look perfect.


    http://orcmid.com/blog
    Wednesday, July 9, 2008 10:59 PM