
Edge Computing vs. 5G: Are they the same thing?

A skeptical engineer separates hype from the facts

By John Walicki

Telcos and industry pundits continue to hype the coming era of 5G and edge computing as the technologies that will unleash the next wave of innovation. Consumers and developers, meanwhile, are confused by the promises and by the ambiguity around how these technologies will influence society in the next era.

Are you excited about faster speeds and AI-infused edge use cases? I am too! But I'm also an electrical engineer turned embedded software developer unfazed by gushy headlines. Show me the spec sheet.

Questions about 5G and Edge Computing

This article addresses some of the questions that developers might have about 5G and edge computing, and how to position these technologies in real implementations.

Edge computing and 5G - Aren't they the same thing?

No! Although 5G and edge are sometimes used interchangeably, they really are different technologies that deliver different value across consumer, retail, and industrial sectors.

Edge computing, at its core, is about moving workloads and models away from the cloud and closer to where the action is. Edge is about managing distributed AI models that process data and distill insights by running predictive analytics close to the source of the data. Next-generation edge tools also facilitate orchestration at scale and include autonomous management.

5G, on the other hand, is a communications protocol and technology set. For the most part, the value lies in new and improved methods of communication, which sometimes include faster, lower latency methods of communicating between devices.

But faster is better, right?

Yes, 5G promises lower latency, higher bandwidth, and network slicing. But while 5G might take latency down from 9 ms to 5 ms compared to 4G, that improvement is only from the device to the cell tower. Today, the end-to-end round-trip latency from the device to the Cloud service is 400-500 ms, so shaving 4 ms off the radio hop reduces the overall latency by only about 1%. Likewise, the core infrastructure network is still bottlenecked at the Cloud backend.
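
To make that arithmetic explicit, here is a quick back-of-the-envelope check using the article's figures; the 450 ms midpoint of the 400-500 ms range is an assumption for illustration.

```python
# Back-of-the-envelope check: how much does 5G alone move end-to-end latency?
radio_4g_ms = 9        # device to cell tower over 4G
radio_5g_ms = 5        # device to cell tower over 5G
round_trip_ms = 450    # assumed midpoint of the 400-500 ms device-to-Cloud round trip

saving_ms = radio_4g_ms - radio_5g_ms
print(f"5G saves {saving_ms} ms, about {saving_ms / round_trip_ms:.1%} of the round trip")
# -> 5G saves 4 ms, about 0.9% of the round trip
```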

What we really want is smarter, not faster. That requires moving the machine learning prediction model workload closer to the person or process to avoid the backend bottleneck. We need edge computing to fulfill the smarter-not-faster desire. Edge computing works over 5G, 4G, LTE, wire-line, satellite, or even disconnected, for that matter -- it doesn't really care.

If Edge Computing is the Next Big Thing, why did I become a certified Cloud Native programmer?

Distributed systems management of computing devices spread out across corporate environments is not a new technology. The computing industry has been doing that since the '90s Client/LAN server era.

What is new is what we've learned over the past decade about managing workloads at Internet scale using Cloud Native technologies, such as containerization and container management. When we bring cloud-native management techniques to the edge to manage and orchestrate those edge devices, we unlock vast value. The good news is that you can apply some of the techniques you have come to love - containerized workloads and Kubernetes - to manage your edge infrastructure.
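
As a minimal sketch of that idea, the snippet below uses the standard Kubernetes Python client to pin a containerized workload to nodes that carry an edge label, via an ordinary nodeSelector. The node label, image name, and namespace are illustrative assumptions, not a prescription for any particular edge product.

```python
from kubernetes import client, config

# Assumes kubeconfig access to a cluster whose edge nodes are labeled node-role/edge=true.
config.load_kube_config()
apps = client.AppsV1Api()

deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="anomaly-detector"),
    spec=client.V1DeploymentSpec(
        replicas=1,
        selector=client.V1LabelSelector(match_labels={"app": "anomaly-detector"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "anomaly-detector"}),
            spec=client.V1PodSpec(
                # Schedule the workload only onto nodes labeled as edge nodes.
                node_selector={"node-role/edge": "true"},
                containers=[
                    client.V1Container(
                        name="anomaly-detector",
                        image="registry.example.com/anomaly-detector:1.0",  # hypothetical image
                    )
                ],
            ),
        ),
    ),
)

apps.create_namespaced_deployment(namespace="edge-workloads", body=deployment)
```

The point is not the specific API calls but that the same declarative, containerized workflow you already use in the cloud carries over to edge nodes.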

Where is the edge?

The LF Edge State of the Edge project articulates several principles:

  1. The edge is a location, not a thing.
  2. There are lots of edges, but the edge we care about today is the edge of the last mile network.
  3. This edge has two sides: a network infrastructure edge and an on-premises device edge.
  4. Compute will exist on both sides, working in coordination with the centralized cloud.

The Edge Glossary has some great definitions. Here are a few notable terms:

  • Device Edge. The device edge refers to edge computing capabilities on the device or user side of the last mile network. The device edge often depends on a gateway or similar device in the field to collect and process data from devices. It might also use the limited spare compute and data storage capacity of user devices such as smartphones, laptops, and sensors to process edge computing workloads. The device edge is distinct from the infrastructure edge because it uses device resources.

  • Device Edge Cloud. An extension of the edge cloud concept where certain workloads can be operated on resources available at the device edge. This edge typically does not provide cloud-like elastically-allocated resources, but it might be optimal for zero-latency workloads.

  • Infrastructure Edge. The infrastructure edge is edge computing capability, typically in the form of one or more edge data centers, deployed on the operator side of the last mile network. Compute, data storage, and network resources positioned at the infrastructure edge allow for cloud-like capabilities similar to those found in centralized data centers, such as the elastic allocation of resources, but with lower latency and lower data transport costs due to a higher degree of locality to the user than a centralized or regional data center offers.

How can I make my edge devices "smart"?

Industrial IoT devices are great at generating data, transmitting that data to the Cloud for analytics, and representing data as a digital twin. Data science techniques allow you to start building AI models to predict the behavior of the device and look for predictive maintenance insights.

While storage and compute are essentially infinite in the Cloud, we really don't need sub-second time series data transmitted and saved forever. The value of the data quickly degrades with time. Even with 5G to send the data faster, its value is ephemeral.

It's better to collect enough data to build a machine learning model in the cloud, deploy that model back down to the edge, and then run model inferencing close to the source of the data. Keep the stream of high-velocity data local and perform video analytics, object detection, or high-speed machine characteristics modeling at the edge.
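
Here is a minimal sketch of that pattern, assuming a model that was trained in the cloud and exported to ONNX: the gateway runs inference locally with ONNX Runtime and forwards only the distilled result upstream. The model file, input shape, threshold, and publish step are hypothetical placeholders.

```python
import numpy as np
import onnxruntime as ort

# Load a model trained in the cloud and deployed down to this edge gateway.
session = ort.InferenceSession("vibration_model.onnx")  # hypothetical model file
input_name = session.get_inputs()[0].name

def read_sensor_window():
    # Placeholder: a real gateway would read a buffer of high-frequency sensor samples.
    return np.random.rand(1, 256).astype(np.float32)

def publish_insight(score):
    # Placeholder: send only the distilled insight upstream, not the raw time series.
    print(f"anomaly score: {score:.3f}")

for _ in range(10):  # a real deployment would loop continuously
    window = read_sensor_window()
    score = session.run(None, {input_name: window})[0].item()
    if score > 0.8:  # alert threshold chosen purely for illustration
        publish_insight(score)
```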

Once you've unlocked the value of distributed edge computing with AI, the business use case possibilities are exciting. The next step is turning that into a Proof of Concept (PoC). Making a few dozen edge devices "smart" makes for a great PoC. Deploying and scaling thousands of edge devices really requires autonomous management techniques like those delivered in the IBM Edge Application Manager.

How does Edge Computing work with distributed AI workloads?

Over any network, 5G networks included, the management of edge devices becomes a problem of scale. How do you deploy, monitor, manage, and reprovision workloads across dispersed edge devices? Building on open source projects governed by LF Edge at the Linux Foundation, the Open Horizon orchestration software lets software engineers, data scientists, and operators manage thousands of devices by pushing and controlling containerized workloads.
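
As a rough illustration of that pattern (and explicitly not Open Horizon's actual API), the toy sketch below shows the core idea behind policy-based placement: nodes advertise properties, workloads declare constraints, and the orchestrator deploys a containerized service wherever the two match. Node names, properties, and services are made up for the example.

```python
# Toy illustration of policy-based workload placement across edge nodes.
# This is NOT the Open Horizon API; it only sketches the matching idea.

edge_nodes = [
    {"name": "camera-gw-001", "properties": {"arch": "arm64", "gpu": False, "site": "plant-7"}},
    {"name": "vision-box-042", "properties": {"arch": "amd64", "gpu": True, "site": "plant-7"}},
]

workloads = [
    {"service": "object-detection:2.1", "constraints": {"gpu": True}},
    {"service": "vibration-anomaly:1.4", "constraints": {"arch": "arm64"}},
]

def matches(properties, constraints):
    """A node is eligible when it satisfies every constraint the workload declares."""
    return all(properties.get(key) == value for key, value in constraints.items())

for workload in workloads:
    targets = [node["name"] for node in edge_nodes if matches(node["properties"], workload["constraints"])]
    print(f"{workload['service']} -> deploy to {targets}")
```

At a few dozen nodes you could do this matching by hand; at thousands of nodes, autonomous, policy-driven placement is the only approach that scales.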

Are there cases when 5G and Edge Computing can work together?

Yes! Imagine an ambulance racing to the hospital with a critically ill or injured patient. Streaming 5G video back to the Emergency Room can buy critical minutes and give the ER doctors and nurses greater situational awareness. Medical sensors in the ambulance can stream the patient's vital signs to the ER over the 5G network. The EMT first responders can be guided by the medical staff in real time, and medical devices running autonomously at the edge can assist the EMTs, doctors, and nurses. As the ambulance arrives at the hospital, those precious minutes of additional care could be the difference in saving a life.

Summary and next steps

I hope these answers help you plan your network and distributed edge projects over the next several years. Though sometimes discussed together, 5G and edge are different technologies that deliver different value. Edge is a set of technologies for deploying and orchestrating AI workloads close to where data is produced. 5G is a communications protocol for the last 500 meters.

I'm looking forward to applying my latest data science and containerization skills toward managing AI workloads at the edge. I hope you, like me, want smarter, not just faster.