By: David Ward
Date: March 6, 2015
There exists an unavoidable question for both community participants and observers: are standards development organizations (SDOs), such as the Internet Engineering Task Force (IETF), still relevant in today’s rapidly expanding environment of Open Source Software (OSS) projects?
For those new to the conversation, the question is not whether SDOs should exist—they are a political reality inexorably tied to trade policies and international relationships. The fundamental reason behind their existence is to avoid a communications Tower of Babel and to establish governance over the use of a global commercial and information infrastructure. The question is whether these organizations have a role in enabling innovation.
SDOs such as the IETF must evolve their processes to keep pace with the technological landscape if those development processes are to remain relevant.
Software has come to dominate what we perceive as the Internet, and the agile development model has created a sharp knee in the rate of innovation over the past couple of years—innovation that needs standardization. Although code is the “coin of the realm” in Open Source Software (OSS) projects, code is not normative.
It is important to have SDOs and consensus-based standards. But SDOs need to realize that the OSS cycle time can create a market-based consensus to fill a standards void and that this realization may be the key to our collective futures.
There is an impedance mismatch between SDOs and OSS projects of at least 2:1 (two years to a paper standard versus one year to a product that creates a de-facto standard).
Globally, many SDOs appear incapable of defining and maintaining their boundaries, and new technology study groups are exploding across them. Every organization is potentially—and dangerously—self-perpetuating. Few SDOs have a life-cycle plan that bounds their authority and scope as applied to new technologies.
Real coordination between SDOs is not readily detectable. This dilutes the efforts and resources of participating companies and individuals, and is creating confusion for the consumers of these technologies.
Within the IETF, we face numerous issues around our own life cycle. How much of our time are we spending on further standardization of established technology at the expense of more-pertinent and relevant working groups? How do we handle issues, technologies, and new architectures that would span our existing structure when they arise (e.g., the recent YANG model explosion across working groups)? What does the subject matter of popular, network-centric OSS projects imply might be missing at the IETF?
Most important, how do we offer startup companies, new vendors, and newly invested consumers the assurance that they have a voice, while avoiding the appearance of an aristocracy rather than a meritocracy-driven body?
Conway’s Law: Organizations that design systems are constrained to produce designs that are copies of the communication structures of these organizations.
To an outsider (and even some insiders), the recent reorganization of the working groups has the appearance of rearranging the deck chairs. It doesn’t change our process. Conway’s Law applies here.
Without more fundamental structural change, we can only expect more of the same process. The world shouldn’t wait two years for a standard for Service Function Chaining, or even more years for Network Virtualization Overlays (or Network Functions Virtualization in general, which is more of a European Telecommunications Standards Institute problem).
While there is much to say regarding the challenges that both the global SDO community and the IETF face, there also are potential risks were the OSS communities to run away with the standardization mantle.
In short, the danger is the coopting of open source due to the lack of governance. Open source software projects with poor governance risk multiple, equally bad fates.
Like the confusion stemming from the uncontrolled overlap of standards from multiple SDOs, OSS projects that overlap can also create confusion. Competition can be both unintentional (e.g., a difference in technical opinions) and purposeful (e.g., vendor freeware offered as open source with no real community diversity to offer support alternatives, complementary products, or hooks to other projects). The result can be multiple small communities that are underfunded or understaffed monocultures dominated by a single party.
Good and impartial third-party governance helps avoid the creation of overlapping, nondiverse, and confusing projects.
OSS projects that don’t connect to form larger architectures can create fragmentation. Fragmentation results when multiple projects each deliver part of an overall solution but cannot be used together, thereby frustrating any progress and interfering with higher-level innovation.
Good governance creates a community that considers both the upstream and downstream connectivity of a project.
Security flaws can result when a project has a weak security focus, often because critical technology has too few reviewers and maintainers. This risk recently manifested in OpenSSL (Heartbleed), and is now being addressed through the Linux Foundation Core Infrastructure Initiative (for OpenSSL, OpenSSH, and NTPd).
Good governance establishes an effective development process—not only for new contributions, but also for maintenance, updates, and releases.
Proper governance also provides essential business, legal, management, and strategic processes that ensure a proper ownership and licensing of contributions, release management, and open community involvement. Excellent examples exist in the Linux Foundation, the Apache Foundation, and the OpenStack Foundation.
Alternative SDO Model
There have been several SDO proposals to subsume and standardize network-centric architectures developed in OSS via the endpoint/interface/application programming interface (API) definition exercise. The Open Networking Foundation (ONF) is an example of an early attempt at a different hybrid model: attempting to bridge both worlds, it used the word standard to describe its wire protocol and the word open to describe its architecture.
The protocol evolved through numerous and sometimes not-backward-compatible specifications, and the organization moved very quickly into advocacy and market development for the protocol, architecture, and OpenFlow controller (the latter activity not normally associated with a traditional SDO).
Although open-source controllers and switches were available, most were developed outside the ONF by individual interest groups. The ONF provided no reference implementation of its own.
The most important lesson from the ONF experience is fundamental to both SDOs and OSS: a truly successful ecosystem and community are created through the openness of a solution framework and through collaboration, not ownership; only in that way can fragmentation and confusion be avoided. Unlike in an SDO, marketing has a place in OSS projects, but it should be focused primarily on community building and engagement.
Why Open API and Framework Standards Are Important
Many of the emerging OSS projects provide broadly scoped and connected solution architectures. It’s important that we discuss the role of SDOs, such as the IETF, in making the connective tissue of these new architectures normative, in order to ensure the functional interoperability that some fear may diminish in this environment (https://tools.ietf.org/html/draft-opsawg-operators-ietf-00, posted by members of the Internet Society).
The future standards in a software-driven network will be in the form of APIs and application/service frameworks. The same reasons that the underlying Internet protocols are standardized apply to these higher-level concepts: interoperability, choice, and system design.
Standardization is necessary to dispel the myths that a future integrating a large amount of OSS means a future in which all software and solutions are free, and that the only viable economic model for a vendor is solely to support OSS.
On the contrary, properly designed, open, and standardized frameworks, protocols, state machines, and the like enable vendors to provide intellectual property in a modular, and, if need be, replaceable manner. There will certainly be community-supported OSS components within developing solutions, but via standardization the incentives for innovation remain for established and startup vendors.
In this way, vendor support of OSS becomes rational and credible—as does its consumption in the operator community.
In spite of political or economic mandates for existence, the right of any SDO to be an authority must be earned. The IETF is arguably the most appropriately focused SDO to engage in standardizing the software-driven network. The IETF is neither too broad (e.g., not involved with health and safety or environment and climate change) nor too narrow (e.g., not a single service or network domain), and the IETF experience with architecture definition, protocol development, and information/data modeling (YANG) overlaps well with the interests and outputs of network-centric OSS projects.
How to Make the IETF Relevant in this Environment
To make the IETF the SDO authority for new things that IT professionals and operators need, I propose that the IETF do the following.
- Consider reforming and restructuring itself to facilitate a more agile process. Kill off what should be dead and make room for new work. Specifically, fail fast in order to succeed faster with fewer yet better ideas that move at the speed the market moves: more Birds of a Feather meetings leading to more successful, relevant working groups with shorter lifespans, less paper to wade through, and more tangible outputs. Enable new working groups to proceed with technical work in parallel with some of the Framework, Architecture, Requirements, and Use-Case drafts that have bogged down so many people for so long. Cut the cycle time for everything (rough consensus shouldn’t take two or more years).
- Emphasize software development more in the IETF structure. Encourage interoperability and function demonstrations all the time. Running code used to be part of the IETF mantra, but running code later is not agile. Think “hackathon” during the standards development process. In my experience, the best standards have been produced hand in hand with the code that implements them.
- Engage in even more research (already a strength), thereby engaging a broader range of participants.
- Fix, change, or reinvent the liaison process because it will be critical to collaboration with OSS projects. In fact, don’t even use the liaison process as a model.
- Embrace Open Source projects. The heart of this effort will require the establishment of an open-loop engagement between the Internet Engineering Steering Group and reputable OSS foundations on productive and compatible projects. A good example of such a compatible and properly governed project is the OpenDaylight Project (Linux Foundation), which is driving the use of YANG models into the IETF as well as other open source projects.
By researching and monitoring OSS projects, we can actively invest our energy in emerging technologies instead of waiting for them to show up on our doorstep.
A feedback loop between parties will identify the areas of new and existing projects that need to be standardized, should use existing standards, or are out of compliance with standards. In my experience, writing code before standardizing has produced the most complete and simplest protocol definitions.
Part of this process will require that the IETF resist the urge to own or copy everything into IETF working groups. SDO geeks and OSS geeks are not the same. Paradoxically, for our purposes, code is not normative. But it’s also hard to define and standardize APIs if you’re not writing code. Forcing an integration of skills and purposes changes a community, with potentially bad results—collaboration, interaction, and an exchange of ideas are a better model for all of us. (Note that collaboration will not work if the SDO cycle creates unnecessary drag on the OSS partner. Conversely, a nonvibrant OSS community won’t be able to interact with any SDO.)
Finally, we must (1) adapt and adopt new laws, and (2) avoid Conway’s Law.
Law of Open Source: The quality and strength of a project is 100 percent dependent on the interests, energy, and capability of the developer community.
Law of Open Standards: The importance, validity, and timeliness of a relevant specification is 100 percent dependent on the interests, energy, and compromises of those who have been empowered to manage, organize, and complete the work effort of the SDO.
If we realize that these laws exist, we must also understand that the roles of OSS and SDOs need to change. We must set a new trajectory, move faster, and focus on building a bigger and better Internet.