What is Edge Computing?

You’ve probably heard us talk about the “edge” a lot lately. But what is an “edge,” and what do we mean by edge computing?

In simple terms, an “edge” is where a network connects with the people using it. The cellular network your phone is wirelessly connected to right now? That’s an “edge.” There are a lot of things we do at the “edge” to make the most of our network… like edge computing.

Take a look at your smartphone. Right now it powers games, videos, and apps… maybe even augmented or virtual reality… all on its own. But this technology uses a lot of battery and typically relies on computing power within the device itself.

What if that computing technology were moved to the “edge”? Using edge compute with 5G, your devices can do more with less… without sacrificing performance. And it’s not just your phone… smartwatches, drones, anything IoT can use this.

Edge computing software and hardware are being added to the places where they will have the most impact: our network edge, and even places like hospitals, sports stadiums, and retail stores. At its heart, edge computing makes things like gaming, AR, VR, drone navigation, and artificial intelligence better.

To understand what edge computing is all about, we first need to understand the role cloud computing plays in industry.

In a typical plant, machines generate data and send it to the cloud via a gateway that routes the data straight to a storage service or to analytics software. The job of the analytics software is to convert the data into something digestible by a real-time dashboard or a machine learning algorithm.
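The gateway’s job can be sketched in a few lines. This is a minimal illustration, not a real cloud SDK: the class and queue names are invented for the example, and the two lists stand in for a storage service and the analytics software.

```python
# Hypothetical sketch of the gateway described above: every machine
# reading is routed to storage and handed to analytics for processing.
from dataclasses import dataclass

@dataclass
class MachineReading:
    machine_id: str
    temperature_c: float

storage_queue = []    # stands in for the cloud storage service
analytics_queue = []  # stands in for the analytics software

def route(reading: MachineReading) -> None:
    """Route one reading to both downstream destinations."""
    storage_queue.append(reading)
    analytics_queue.append(reading)

route(MachineReading("press-01", 71.5))
```

In a real deployment the lists would be replaced by calls to an object store and a stream-processing service, but the routing role of the gateway is the same.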

In the case of machine learning, the algorithm is trained on the data, and out of that comes a model: a model that describes everything the machine has experienced in the past. This is useful because you can then use the model in real time, so that when the machine encounters an undesired condition, you can take action before it causes damage by sending a control signal to the machine on the factory floor.
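That train-then-act loop can be sketched as follows. This is an assumption-laden toy: the “model” here is just a mean and standard deviation learned from past readings, and the signal name is invented; a real system would use a proper ML library and a real control protocol.

```python
# Toy version of the loop above: learn "normal" from historical data,
# then watch live readings and emit a control signal before damage occurs.
import statistics

def train(history: list[float]) -> tuple[float, float]:
    """Build a 'model' of past behavior: the mean and spread of readings."""
    return statistics.mean(history), statistics.stdev(history)

def check(model: tuple[float, float], reading: float) -> str:
    """Compare a live reading against the model and decide on an action."""
    mean, stdev = model
    if abs(reading - mean) > 3 * stdev:
        return "SLOW_DOWN"  # hypothetical control signal to the machine
    return "OK"

model = train([70.0, 71.0, 69.5, 70.5, 70.2])  # past machine data
signal = check(model, 95.0)                    # an undesired condition
```

The key point is the same as in the text: the model is built from history, but it pays off in real time, when a single out-of-range reading triggers a control signal.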

So the idea of edge computing is this: instead of sending all the data upstream and having all this activity happen in the cloud, why not push these components to the edge, closer to the factory equipment? That way, decisions are made right at the source of the data, which reduces the cost of transferring data, works within limited bandwidth, and makes it possible to use these technologies in mission-critical applications that would otherwise be impractical because of the latency involved in sending data to and receiving it from the cloud.
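The bandwidth saving is easy to see in a small sketch. The threshold and numbers here are illustrative assumptions: the edge node makes the decision locally and forwards only the readings that actually need cloud attention.

```python
# Sketch of the edge-side decision: evaluate readings at the source and
# send only the interesting ones upstream, instead of shipping everything.

def edge_filter(readings: list[float], limit: float) -> list[float]:
    """Decide locally which readings need to go to the cloud at all."""
    return [r for r in readings if r > limit]

readings = [70.1, 70.3, 95.2, 70.0, 70.4, 96.8]   # raw data at the source
upstream = edge_filter(readings, limit=90.0)       # what leaves the plant
saved = 1 - len(upstream) / len(readings)          # fraction of transfers avoided
```

In this toy run, four of the six readings never leave the plant, and the two that do are exactly the ones a cloud dashboard or operator would want to see.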

This, in essence, is what edge computing is all about.
