The Future of TOSCA and NFV


Open Standards

"Open Standards" facilitate interoperability and data exchange among different products or services and are intended for widespread adoption.

Where some differences of opinion may exist is in the details of the ITU-T definition of open standards, specifically those that refer to Intellectual Property Rights (IPRs). For the purposes of this article, I will use “open standards” to refer solely to those whose IPRs are licensed to all applicants on a worldwide, non-discriminatory basis, free of charge. All others I consider “open-but-not-quite-free”.

Open Source

Open source covers two related concepts regarding the way software is developed and licensed, captured in the “Free and Open Source Software” and “Open Source” definitions: free access to code, and free distribution of code. While “code is king”, open source organizations also document their efforts in specifications, which by implication can also be considered “open standards”. If adopted, they become “de facto” standards, and may lead to changes in “de jure” standards.

This is in fact the healthiest way to produce a valuable standard: coopetition between standards development organizations (SDOs) and open source projects that results in a true “open standard” that is validated, adopted and finally formalized at some point.

The Dangers of Over-Standardization

With the focus on NFV management and orchestration, standardization typically brings OPEX reduction through more repeatable operations, a certain degree of vendor independence for operators, the promise of interoperability between products from different vendors, and lower integration costs. At the same time, standardization reduces the potential for innovation and introduces barriers for operators intending to introduce services at the cutting edge of technology.

The most comprehensive set of standards for NFV management and orchestration comes from ETSI NFV, which by all accounts has done an extraordinary job in a relatively short time. In fact, I believe that ETSI NFV has done too good a job: it pushed for normative standards (Release 2 and Release 3) too early, instead of allowing more time for the informative standards (Release 1) to be absorbed, put to the test, and evolved with industry feedback before anything was mandated. Too much, too early may also mean less-than-optimal standardization for a still-evolving NFV paradigm.

ETSI NFV standards have been developed under constraints such as a purely NFV-centric approach, avoidance of FCAPS management, and reluctance to decompose network functions. The “world according to NFV” was necessary when studying the NFV paradigm early on, but operators need those constraints removed in order to fully benefit from NFV.

It is time for a pause to assess whether a large number of normative standards with many interdependencies (practically an all-or-nothing, untested, restrictive set) is premature, and whether a different approach that does not inhibit innovation and is more future-proof would be more appropriate.

Just imagine the effect on ETSI NFV standards if, based on implementation and testing experience, operators adopt some of the following scenarios in production:

  • The NFVO is absorbed into a higher-layer Service Orchestrator;
  • The specific VNFM disappears, or is recast as a VNF component;
  • The generic VNFM is absorbed by the NFVO;
  • The notion of Network Service disappears, or is absorbed into a broader notion of Service; and
  • The VNF as we know it is further decomposed into granular network functions exposed directly to the management architecture (a direction sketched in the TOSCA example below).
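
To make the last scenario more concrete, here is a minimal sketch of how a decomposed “VNF” might be modeled in TOSCA’s YAML Simple Profile, with each granular function exposed to the orchestrator as a first-class node. The function node types (example.nodes.DPIFunction, example.nodes.ShaperFunction) and the upstream requirement are hypothetical illustrations for this article, not definitions from the TOSCA NFV profile or ETSI SOL001; only tosca.nodes.Compute is a standard TOSCA base type.

    tosca_definitions_version: tosca_simple_yaml_1_2

    description: >
      Illustrative sketch: a formerly monolithic VNF decomposed into two
      granular network functions, each individually visible to management.

    topology_template:
      node_templates:

        packet_inspector:
          # Hypothetical granular function; inside a monolithic VNF this
          # would be an internal component invisible to the orchestrator.
          type: example.nodes.DPIFunction
          requirements:
            - host: inspector_host

        traffic_shaper:
          # Hypothetical granular function whose dependency on the
          # inspector is now explicit and manageable, not buried in a VNFD.
          type: example.nodes.ShaperFunction
          requirements:
            - host: shaper_host
            - upstream: packet_inspector

        inspector_host:
          type: tosca.nodes.Compute        # standard TOSCA base type
          capabilities:
            host:
              properties:
                num_cpus: 2
                mem_size: 4 GB

        shaper_host:
          type: tosca.nodes.Compute
          capabilities:
            host:
              properties:
                num_cpus: 1
                mem_size: 2 GB

Once functions are modeled at this granularity, questions such as which entity plays the VNFM for a given node no longer have a single obvious answer, which is exactly why locking today’s VNF boundary into normative standards carries risk.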

Where and when standards are necessary is typically the operators’ call; vendors thrive on differentiation.


