Date: July 6, 2011
What role—if any—should the IETF play in the development of application protocols? That was the hot-button issue debated by expert panelists at the Internet Architecture Board’s technical plenary session on March 28 in Prague, Czech Republic.
The panelists noted that today’s Internet application developers tend to favor an open source approach, such as publishing their own application programming interfaces (APIs), rather than participating in standards bodies such as the IETF. Developers favor this approach because it dramatically shortens their time to market compared with the traditional standards development process.
“These guys have achieved really fast scale for interprovider messaging without standards,” said Jonathan Rosenberg, chief technology strategist at Skype. “It’s interesting because it’s all about a new model for delivery of apps to users…. Nowhere during this process did anybody need to show up at an IETF meeting and ask for a standard…. All the intermediaries are cut out and the dependencies are gone, and that’s what has brought life and innovation to these apps.”
Several panelists promoted the idea of the IETF getting involved in new standards-development work that could underpin these APIs and provide basic, interoperable functionality to next-generation browsers.
“The consequence of the browser as a delivery platform is that there are an enormous amount of APIs required,” said Henry Thompson, who serves as liaison between the IETF and the W3C. “With so many new APIs…what happens to the One Web for All goal?”
Thompson kicked off the panel with an overview of activities at the W3C’s Technical Architecture Group related to future Web architecture. Thompson said that both the IETF and the W3C were facing some of the same difficulties in engaging participants in application-oriented work, and he recommended that they cooperate more in the future.
“The fact that the IETF and the W3C have proceeded in relatively amicable parallel tracks for many years without regular exchange of people is, in hindsight, unfortunate,” Thompson said. “Having a little more first-person interaction between the two groups is good for all of us.”
Thompson said the W3C is reconsidering its Web architecture documents in light of the fact that the Web is no longer a collection of documents, but instead has evolved into a collection of documents, data, and applications.
“Tensions have arrived because HTTP, HTML and browsers were not designed to deliver applications, they were designed to share documents,” Thompson said. “Our new Open Web Platform is a platform for innovation, consolidation and cost efficiency…. These new Web architecture documents are pretty much squarely in W3C territory, but many of the more recent concerns have drawn us into IAB territory. Clearly, it’s time to give some thought to demarcation.”
Harald Alvestrand, a former IETF chair, urged the group’s participants to stay involved in the development of application-oriented standards, with the overriding goal of making the Internet work better. He promoted the idea of Real Time Communications Web (RTC Web), a proposed working group that would create specifications to support interoperability among future browsers.
“We should have a uniform interface in the browser so you can send media from one browser to another without all this plug-in stuff,” Alvestrand said. “You should be able to have compatibility between browsers. If I use Firefox, and you use Opera, it doesn’t matter; we are compatible. But in order to actually work, it has to be matched with uniform APIs inside the browsers so the downloadable applications can run anywhere naturally.”
Alvestrand said that having uniform APIs will help address the fact that end users have different kinds of platforms—with varying screen sizes, microphones, and cameras—and that applications need to detect those differences and behave accordingly. “This makes interfaces complicated unless we are really good at designing them,” he said.
Having a standard real-time communications platform for browsers would foster innovation because it is so expensive to build a proprietary one, Alvestrand argued.
“If the interfaces are standard and universally deployed, anyone can write a video-using application. It’s cheap,” he said. “If you have standard APIs and functionality availability, you can just try something, put it out there, and see if anyone uses it.”
In summation, Alvestrand said that the Internet’s new applet paradigm requires more standards, not fewer.
“The IETF needs to lose its fear of the type of protocol called an API,” he said. “The IETF needs to take responsibility for making sure the whole thing works up to a level where it can actually be accessed. The IETF needs to work together with other organizations so the appropriate wisdom…is made available to the right people at the right time.”
Rosenberg, one of the lead authors of the IETF’s Session Initiation Protocol (SIP), discussed the successes and failures of this VoIP protocol as an example of how the group’s standards are getting adopted—or not adopted—in the application space. He pointed out that the IETF began work on SIP in 1999 and has published more than 100 technical specifications related to SIP in the years since.
“By anybody’s metrics, [SIP] has been a heavily successful protocol,” Rosenberg said. “It has been implemented in hundreds, if not thousands, of products. It is making people money, and it is making people happy…. The protocol met the needs of an industry at the time.”
However, Rosenberg pointed out that SIP has failed to provide functionality beyond what the Public Switched Telephone Network already offered, doing little to deliver new features. He attributed this failure to the economics of telecommunications carriers, which take years, if not decades, to introduce new functionality.
“If you’re a service provider that wants to roll something out, you have to get enough community interest to get something massively marketable enough to get it through, which means it’s probably pretty vanilla technology,” he said. “Something weird and unusual is not going to make it through the curve. That means it’s pretty difficult to roll out and deploy innovative new stuff.”
Rosenberg noted that if end users had demanded innovative, interoperable features, the market would have met that need. Instead, service providers are offering simple, commodity features on slower timelines, and they are putting more energy into developing proprietary features, which are quicker to develop and offer differentiation.
One trend “is the popularity of service providers publishing REST APIs as a new form of inter-domain interoperability, also without standardization,” Rosenberg said. “The model there is a service provider…writes some code that sits on a server somewhere, they write some code that gets sent to a client through an app store or client. They deploy it, and then they publish it. Then they are done.”
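The deploy-then-publish model Rosenberg describes can be illustrated with a minimal sketch: a provider exposes an HTTP endpoint returning JSON, and any third party can integrate against it with an ordinary HTTP client, no standards body involved. This example uses only the Python standard library; the `/presence/alice` path and its payload are hypothetical, invented for illustration.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class PresenceAPI(BaseHTTPRequestHandler):
    """A hypothetical provider API: the provider deploys this, then simply
    publishes documentation for the endpoint. No standardization step."""

    def do_GET(self):
        if self.path == "/presence/alice":
            body = json.dumps({"user": "alice", "status": "online"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        # Silence per-request logging for this demo.
        pass

def serve_once(port=8080):
    """Run the provider's API in a background thread and return the server."""
    server = HTTPServer(("127.0.0.1", port), PresenceAPI)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

if __name__ == "__main__":
    server = serve_once()
    # A third-party integrator needs nothing more than an HTTP GET.
    with urllib.request.urlopen("http://127.0.0.1:8080/presence/alice") as resp:
        print(json.load(resp)["status"])  # online
    server.shutdown()
```

The point of the sketch is the absence of any interoperability negotiation: the wire format is whatever the provider deployed, and clients conform to the published documentation rather than to a standard.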
Rosenberg added that these service providers “are innovative, they have been moving fast, and they have dramatically shortened the time to market. These guys produce software and deploy services in weeks or months, as opposed to years or decades following the telecom innovation cycle.”
In the past, service providers waited until standards were set before deploying new applications. Today, “the need for having inter-provider standards is gone,” Rosenberg concluded. “Standards are moving from being first, to being last, if ever.”
Leslie Daigle, chief Internet technology officer at the Internet Society, wrapped up the panel with a recommendation that the IETF focus on creating standards for the building blocks that developers can use to create innovative applications and services. As an example, she pointed to HTTP, which is a protocol for linking and sharing content that is serving as a key building block for many of today’s proprietary applications.
“We have a bad track record of predicting the future of the Internet,” Daigle said. “That is why we specify building blocks and not buildings.”