By: Bernd Pruessing
Without data integrity, any undertaking that relies on the data in question is doomed to fail. Take a digital twin, for example. For communication service providers (CSPs), which operate a highly complex hybrid network infrastructure spanning the data center, the network, and the customer edge, a digital twin is a valuable tool for designing, planning, and operating the network. As a virtual replica of the infrastructure, a digital twin clones all physical assets, networks, and processes. This information can be used to run simulations to understand the impact of changes before they are made and to run what-if analyses that improve network operations. Pairing the virtual and physical worlds in a live model lets operators analyze data and monitor systems to optimize space, capacity, and connectivity management and to mitigate problems before they occur. If the data in the digital twin is bad, so too will be the outcomes of decisions based on it.
To successfully maintain a digital twin, or any inventory, the data must be reconciled. Ideally, the reconciliation process is automated to resolve any differences between the actual network and its representation. Otherwise, discrepancies must be investigated and resolved manually, which takes longer, costs more, and is prone to human error.
Seamless, automated alignment with the network requires adapters to the network devices or the network management system to collect the data, which may include hardware as well as logical and virtual resources such as connectivity, cells, or software applications. To implement adapters flexible enough to accommodate new network resources quickly and cost-efficiently, the right integration framework is crucial.
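One way to picture such an integration framework is a common adapter interface that every data source implements, so that new resource types plug in without changes to the core inventory. The sketch below is illustrative only; the `NetworkAdapter` interface and `StaticNmsAdapter` class are hypothetical names, and a real adapter would query an NMS rather than return canned records.

```python
from abc import ABC, abstractmethod
from typing import Dict, List


class NetworkAdapter(ABC):
    """Common interface so new resource types can be added as plug-ins."""

    @abstractmethod
    def discover(self) -> List[Dict[str, str]]:
        """Return discovered resources as records in the unified model."""


class StaticNmsAdapter(NetworkAdapter):
    """Stand-in for a real NMS adapter; returns canned data for the sketch."""

    def __init__(self, records: List[Dict[str, str]]):
        self._records = records

    def discover(self) -> List[Dict[str, str]]:
        # A real adapter would call the NMS (e.g. via SNMP or a REST API)
        # and map vendor-specific fields onto the unified resource model.
        return [{"id": r["id"], "type": r.get("type", "unknown")}
                for r in self._records]


# The inventory consumes every adapter through the same interface.
adapters: List[NetworkAdapter] = [
    StaticNmsAdapter([{"id": "router-1", "type": "hardware"},
                      {"id": "vnf-7", "type": "virtual"}]),
]
inventory = [rec for adapter in adapters for rec in adapter.discover()]
print(inventory)
```

Because the core only depends on the abstract interface, supporting a new vendor or resource type means writing one new adapter class, not reworking the inventory itself.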
Before we can talk about an integration framework, we need to address reconciliation. An inventory project’s success depends on how well data discrepancies can be found and fixed. Open architecture plays an important role here and has become a best practice in today’s complex hybrid environments. A truly open architecture prevents vendor lock-in. This matters because CSPs need the ability to extend their inventory management solution themselves or through a preferred integrator.
The first step for any inventory management project is the consolidation of available data. This involves the initial migration of existing data sources to the system. Ultimately, the process of uploading data and the alignment processes between the existing network and other systems for daily operations should be automated. Once the processes are automated, accurate data is available to feed rollout, planning and tracking processes, as well as coordinate the IT resources and services that make up the network. The success of such a project is entirely dependent on how efficiently the upload can be implemented. Data upload does not just mean documentation of data. It means merging, analyzing and fixing discrepancies between data from NMS southbound, east-west interworking with other databases and integration northbound with OSS/BSS. In other words—reconciliation.
Keep in mind that merging, analyzing, and fixing discrepancies between data from network devices and OSS/BSS can be cumbersome. To streamline reconciliation workflows, CSPs should utilize a unified resource model to align and automate network processes. Analysis and graphical representation systems, such as BI or GIS applications, can also help CSPs get the most benefit out of the database.
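At its core, reconciliation against a unified resource model boils down to comparing the documented inventory with what the network actually reports and classifying the differences. The following is a minimal sketch under simplified assumptions (resources keyed by a unique id, attributes already normalized into plain dictionaries); the `reconcile` function and the sample records are hypothetical.

```python
def reconcile(inventory, network):
    """Compare inventory records against live network data, keyed by id.

    Returns resources missing from the inventory, stale inventory
    entries, and attribute mismatches to be investigated.
    """
    inv_ids, net_ids = set(inventory), set(network)
    missing = sorted(net_ids - inv_ids)   # in the network, not documented
    stale = sorted(inv_ids - net_ids)     # documented, no longer present
    mismatched = {
        rid: (inventory[rid], network[rid])
        for rid in inv_ids & net_ids
        if inventory[rid] != network[rid]
    }
    return missing, stale, mismatched


# Hypothetical sample data in a unified model.
inventory = {"router-1": {"slots": 4}, "olt-2": {"slots": 8}}
network = {"router-1": {"slots": 6}, "cell-9": {"band": "n78"}}

missing, stale, mismatched = reconcile(inventory, network)
print(missing)     # ['cell-9']
print(stale)       # ['olt-2']
print(mismatched)  # {'router-1': ({'slots': 4}, {'slots': 6})}
```

An automated workflow would feed these three result sets into resolution rules (e.g. auto-document newly discovered resources, flag stale or mismatched entries for review), which is exactly where manual reconciliation otherwise consumes time and invites error.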
Data integration is the most costly and complex aspect of implementing a new inventory management solution. The reason is that, in today’s hybrid world, physical hardware must be managed in harmony with logical and virtual resources. This data comes from many sources and vendors, some of which have limited integration capabilities or vendor-specific behaviors. Migrating existing data or uploading data from the network, the data center, and other domains involves three important considerations.