
The Road to Open Edge Computing



Architecture options and where to find them

The evolution of cloud computing doesn’t take the cloud out of the picture; it merely restructures it. In most cases, you will find that edge computing extends the already available cloud data center to bring more functionality closer to end users. Unfortunately, the environmental circumstances don’t change with this new way of organizing resources: you will still fight unreliable networks and run workloads in rural areas that are sometimes hard and always expensive to access, all while needing to minimize latency, maximize bandwidth and manage the life cycle of both infrastructure and applications.

The questions come naturally: How does one overcome all these challenges? And is there an ultimate solution out there?

No matter how hard you look, there is no one-size-fits-all solution that satisfies every use case. The simple reason comes back to the earlier point: your edge is not unique, but it is just different enough to make the next steps a little more complicated than they first seem.

You can probably find components for your edge use case on the shelf of every vendor by now, whether they sell software, hardware or both. The new terms brought new business to the table, along with the next questions: Which pieces do we choose, and how do we fit them all together?

While edge computing is in its early stages, it already has a significant footprint in the open source ecosystem. Collaboration is crucial in this area because of the variety of building blocks that can be combined in numerous ways to fit an endless number of scenarios. Your AI and machine learning application can control robots on a factory floor or the life cycle of shrimp on a farm while being connected to a cloud infrastructure either locally or remotely.

And when we talk about connections, you immediately realize that the forever unsolved challenge of interoperability is under the magnifying glass here. Along with standardization, open source software development methods are among the most efficient ways to overcome this obstacle, since APIs and interfaces are openly defined and available to examine and test. The various groups can also work together to integrate many pieces of the infrastructure before any of it hits your lab for a trial.

Groups such as the OSF Edge Computing Group are collecting use cases and requirements to find commonalities and build architecture models that help development communities evolve and test the pieces you will need to build your infrastructure on whatever footprint you have available.


Figure 2: Model comparison

Current architecture models all build on cloud data centers connected to edge sites of different sizes and varying capabilities, which often results in a spider-web-like graph. The scale of these webs can grow extremely large, especially in telecom use cases, where you need to balance how much functionality you put on each dot of the web against what it will cost to manage and maintain. These observations have also surfaced common failure cases with specific behavioral requirements the infrastructure has to satisfy, which in turn hint at certain design decisions. The most common scenario is losing the connection between an edge data center and the central cloud; depending on your use case, the autonomy you need in your edge sites differs.
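To make the autonomy question concrete, here is a minimal sketch of that connection-loss scenario, assuming a simple heartbeat between an edge site and the central cloud. All names in it (EdgeSite, HEARTBEAT_TIMEOUT and so on) are hypothetical and not taken from any particular edge framework: when heartbeats stop arriving, the site flips into autonomous mode and keeps its local workloads running on cached state until connectivity returns.

```python
import time

# Hypothetical sketch of edge-site autonomy: the site tracks heartbeats
# from the central cloud and switches to autonomous operation when they
# stop arriving. Names are illustrative assumptions only.

HEARTBEAT_TIMEOUT = 30.0  # seconds of silence before assuming the link is down


class EdgeSite:
    def __init__(self, name: str) -> None:
        self.name = name
        self.last_heartbeat = time.monotonic()
        self.autonomous = False

    def on_heartbeat(self) -> None:
        """Invoked whenever a heartbeat from the central cloud arrives."""
        self.last_heartbeat = time.monotonic()
        if self.autonomous:
            # Link restored: leave autonomous mode and resynchronize
            # locally accumulated state with the central cloud.
            self.autonomous = False
            print(f"{self.name}: reconnected, resyncing with central cloud")

    def check_connectivity(self) -> None:
        """Run periodically by the site's local control loop."""
        if time.monotonic() - self.last_heartbeat > HEARTBEAT_TIMEOUT:
            if not self.autonomous:
                self.autonomous = True
                print(f"{self.name}: central cloud unreachable, going autonomous")
        # Whether connected or not, local workloads (the factory robots,
        # the shrimp-farm sensors) keep running on locally cached config.
```

How long the timeout should be, and what "autonomous" actually permits a site to do, is exactly the per-use-case decision the architecture models above try to capture.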


