By: Ivo Ivanov
Hype is easy; substance is harder. That’s the harsh reality bearing down on us as we rush to capitalize on the endless possibilities of edge computing. The Gartner Hype Cycle, which tracks innovations, our expectations, and our ability to deliver on them, places edge computing in the “Peak of Inflated Expectations” phase. Next will come the “Trough of Disillusionment” before we embark on the coveted “Slope of Enlightenment,” where the technology will come of age and we can reach optimum productivity.
Our expectations around edge computing are certainly inflated, but they’re not misguided. We simply need to overcome the technical hurdles and physical security issues facing the industry to allow for a more rapid and seamless transfer of data between the edge and the countless enterprises, network operators, and data centers that make up our global data network. These are the three “big players” when it comes to moving data, and the success of edge computing rests on their ability to interconnect and hone their data streams in a way that is fast, frictionless, and secure.
Network operators, enterprises, and data centers must together support the “digital interconnection triangle” of 5G, IoT, and AI: the heart, hands, and brain of edge innovation. The heart is 5G connectivity, the hands are the countless IoT devices that are emerging, and the brain is artificial intelligence. Each will have unique computing requirements at the edge if we are to realize the technology’s true potential. In this article, we’ll look at these requirements in more detail, as well as at the bandwidth and data-stream challenges the big players will need to overcome in order to innovate at the edge. First, let’s examine where edge computing is today and which use cases are being adopted or pursued.
At its core, edge computing is a distributed information technology architecture that puts data processing, analysis, and even intelligence as close as possible to the endpoints that gather data and use it to make decisions. Processing data at the edge can offer incredible benefits, from improved performance and real-time analysis to better data security and cost-effectiveness. Consumer-facing industries, for instance, are pursuing edge computing so that they can offer a more personalized customer experience by gathering and analyzing endpoint data in real time. Healthcare facilities are becoming more reliant on edge computing to provide in-hospital patient monitoring instead of relying on third-party clouds to process data. Edge computing is also a critical component of smart city infrastructure, allowing sensors and IoT devices to monitor energy use, manage traffic, and even carry out predictive maintenance.
Life on the edge is good. It’s fast, it’s streamlined, and it’s necessary if we are to continue down this path of real-time analytics, automation, and rapid data processing. But first, we must lay the path.
If our edge computing goals are to be realized across myriad industries, we need a more efficient way of interconnecting players within the connectivity ecosystem. The digital interconnection triangle of 5G, IoT, and AI is a disruptive force, but these technologies need to be managed effectively. As enterprises start to take advantage of them, they will need to adopt new interconnection service regimes that are customized for their particular needs; there is no one-size-fits-all approach here.
Before we look at how these new interconnection services might work, let’s first break down the interconnection triangle and address its needs.
5G, and eventually 6G, is the heart of innovation. 5G is capable of handling a wide range of frequencies and transmitting multiple data streams at once. It was designed to handle data from a large number of sensors and other endpoints and has become one of the foundational pillars of IoT. It offers reduced latency and faster application response times, and it makes it easier for businesses to collect and process data securely.
The Internet of Things (IoT) is the hand of innovation. The number of IoT devices around the world is expected to triple in the next decade, from 9.7 billion in 2020 to almost 30 billion by 2030. These are the “hands” gathering the data, from sensors and smart cameras to uCPE equipment, servers, and processors. Some of these devices will reside on business premises, while some will reside in edge computing data