In the last few years, with the emergence of 5G, IoT, and AI, cloud providers have recognized the value and importance of the edge and have started building infrastructure technologies that provide computing at the edge and connectivity between cloud and edge, as well as edge-based intelligent services, as part of their overall cloud value proposition. However, this is a relatively new area of cloud computing, with great opportunities as well as challenges.
Top Scenarios of Cloud-Edge Computing
New classes of applications such as 5G, IoT, and AI, along with the push to maximize cloud computing resources, have been driving the development of edge computing as an extension of the cloud.
Here are the major scenarios for this cloud-edge computing development:
- Low latency: For latency-sensitive services such as 5G and IoT, running on the edge provides much lower latency and faster response times. We have seen more and more latency-sensitive services being built around the public cloud ecosystem.
- Massive data: With the rapid growth of devices and users, we’re generating massive amounts of data every day. It is very costly to move device- or user-generated data to the cloud for processing in real time or on a regular basis. Instead, it is more efficient to run services on the edge and perform local data processing. In some cases, this is the only viable option, and it can save tremendous network cost.
- Privacy/security: For some businesses or regions, sensitive and/or confidential data must be stored locally on the edge (without moving it to the cloud) to comply with privacy laws and/or security regulations.
- Distributed cloud architecture: While adopting public cloud services, many enterprises also need to run part of their business on local edge servers, which challenges cloud providers to design and provide a distributed cloud architecture for their users.
Top Use Cases: Edge Analytics and Edge AI
One of the top cloud-edge computing opportunities is edge analytics, which moves data analysis and decision making closer to where the data is generated, such as IoT devices, edge servers, and gateways, while the cloud remains responsible for overall service lifecycle management, service scheduling, data storage or warehousing, and more comprehensive service or data analysis such as big data analytics. The key benefits of edge analytics are reduced latency and increased security of the analytics, and hence faster business decision making.
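As a rough illustration of this split, here is a minimal Python sketch of edge-side aggregation: raw readings are summarized and acted on locally, and only a compact summary is forwarded upstream for warehousing and deeper analysis. The endpoint URL, threshold, and data shape are hypothetical placeholders, not tied to any specific product.

```python
import json
import statistics
import urllib.request

CLOUD_ENDPOINT = "https://cloud.example.com/ingest"  # hypothetical cloud ingestion URL


def summarize_window(readings):
    """Reduce a window of raw sensor readings to a compact summary on the edge."""
    return {
        "count": len(readings),
        "mean": statistics.mean(readings),
        "max": max(readings),
        "min": min(readings),
    }


def forward_to_cloud(summary):
    """Send only the summary upstream; the raw data stays on the edge."""
    body = json.dumps(summary).encode("utf-8")
    req = urllib.request.Request(
        CLOUD_ENDPOINT, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        return resp.status


if __name__ == "__main__":
    window = [21.4, 21.9, 35.0, 22.1]  # e.g. temperature readings from local IoT devices
    summary = summarize_window(window)
    # Local decision making happens immediately, without a round trip to the cloud.
    if summary["max"] > 30.0:
        print("edge alert: threshold exceeded", summary)
    forward_to_cloud(summary)
```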
Similarly, edge AI enables applications such as autonomous navigation, remote monitoring using NLP or facial recognition, and video analytics by embedding AI algorithms in devices connected to distributed, low-latency, and reliable AI services on the edge, while cloud resources can be pulled in for more in-depth analysis. Here is an image recognition example using both edge and cloud resources and technologies: the edge processes the initial image collected from the device, performs initial analysis, and determines that there is a dog in the image. The edge then sends the extracted dog image, rather than the whole original image file, to the cloud for further analysis. The cloud uses a larger data set to determine the breed of the dog and sends that information back to the edge as a more detailed description of the image.
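That flow can be sketched roughly as follows. The detector stub, cloud endpoint, and returned fields are all placeholders standing in for whatever inference runtime and cloud service are actually deployed; the point is only that the edge answers quickly and uploads just the cropped region.

```python
import io
import json
import urllib.request

from PIL import Image  # pillow, used here to crop the detected region

CLOUD_CLASSIFY_URL = "https://cloud.example.com/classify"  # hypothetical cloud AI service


def detect_on_edge(frame: Image.Image) -> dict:
    """Lightweight edge-side detection: coarse label plus bounding box.

    Placeholder for a real on-device model (e.g. a small object detector).
    """
    return {"label": "dog", "box": (40, 30, 400, 380)}


def classify_in_cloud(cropped: Image.Image) -> dict:
    """Send only the cropped region, not the full frame, for fine-grained analysis."""
    buf = io.BytesIO()
    cropped.save(buf, format="JPEG")
    req = urllib.request.Request(
        CLOUD_CLASSIFY_URL, data=buf.getvalue(), headers={"Content-Type": "image/jpeg"}
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.loads(resp.read())


def process_frame(frame: Image.Image) -> dict:
    detection = detect_on_edge(frame)
    if detection["label"] != "dog":
        return detection  # the edge answers immediately; nothing is uploaded
    crop = frame.crop(detection["box"])
    details = classify_in_cloud(crop)  # e.g. {"breed": "border collie"}
    return {**detection, **details}
```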
Top Challenges
Cloud-edge computing brings many technical challenges to cloud providers, especially around supporting heterogeneous computing, coping with network instability, and securing the edge.
Here are the top challenges we have seen:
- The amount of resources on the edge is limited. Services deployed on the edge need to be more lightweight and less resource-intensive.
- The IoT devices on the edge can be massive in number and diverse in kind, which can lead to device management challenges for the edge.
- The connectivity between the cloud and the edge can be unstable. It is important to implement well-thought-out network protocols to ensure reliable data transmission between the cloud and the edge (see the store-and-forward sketch after this list).
- There needs to be a streamlined security strategy connecting IoT devices to the edge and to the cloud.
- Since this is still a relatively new area with great promise and competing visions, there are many disconnected technologies, point solutions, and fragmented communities addressing this space. This can make it challenging for users to determine the best architecture and solution stack for their edge cloud computing use case.
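One common pattern for coping with an unstable cloud-edge link is store-and-forward: messages are persisted in a local outbox on the edge and drained to the cloud with retries and exponential backoff whenever connectivity returns. The sketch below uses assumed names (the endpoint URL and database path are placeholders) and is only one way to approach this.

```python
import json
import sqlite3
import time
import urllib.request

CLOUD_URL = "https://cloud.example.com/telemetry"  # hypothetical cloud endpoint
DB_PATH = "edge_outbox.db"  # local queue survives restarts and network outages


def init_outbox(path: str = DB_PATH) -> sqlite3.Connection:
    conn = sqlite3.connect(path)
    conn.execute("CREATE TABLE IF NOT EXISTS outbox (id INTEGER PRIMARY KEY, payload TEXT)")
    return conn


def enqueue(conn: sqlite3.Connection, message: dict) -> None:
    """Persist the message locally first, so nothing is lost while the link is down."""
    conn.execute("INSERT INTO outbox (payload) VALUES (?)", (json.dumps(message),))
    conn.commit()


def flush(conn: sqlite3.Connection, max_backoff: float = 60.0) -> None:
    """Drain the outbox in order; back off and retry when the cloud is unreachable."""
    backoff = 1.0
    for row_id, payload in conn.execute("SELECT id, payload FROM outbox ORDER BY id").fetchall():
        while True:
            try:
                req = urllib.request.Request(
                    CLOUD_URL,
                    data=payload.encode("utf-8"),
                    headers={"Content-Type": "application/json"},
                )
                urllib.request.urlopen(req, timeout=5)
                conn.execute("DELETE FROM outbox WHERE id = ?", (row_id,))
                conn.commit()
                backoff = 1.0
                break
            except OSError:  # URLError and socket timeouts are OSError subclasses
                time.sleep(backoff)
                backoff = min(backoff * 2, max_backoff)  # capped exponential backoff
```

Since a send may succeed without the delete being recorded (for example if the edge node crashes in between), the cloud side should treat deliveries as at-least-once and deduplicate accordingly.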
KubeEdge and Volcano: Cloud Native Projects that Extend Kubernetes to the Edge and AI
Cloud native computing is considered the de facto approach for cloud, edge, and AI application and service development, and Kubernetes has proven to be the cornerstone of cloud native computing today. KubeEdge, a CNCF Sandbox project launched in July 2018 with increasing user and developer support, extends Kubernetes to the edge. KubeEdge 1.2, released in February 2020, includes many reliability features that address the cloud-edge network challenges and improve cloud-edge connectivity. Volcano is an open source project that extends Kubernetes to meet the needs of AI and deep learning workloads. Volcano has recently received CNCF Runtime SIG approval and is currently applying for CNCF Sandbox status.
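Because KubeEdge registers edge nodes as ordinary Kubernetes nodes, standard Kubernetes tooling can target them from the cloud-side control plane. The sketch below uses the Kubernetes Python client to pin a small workload to an edge node via a nodeSelector; the node label shown is the one commonly applied to KubeEdge edge nodes, but the exact label, image name, and resource limits here are assumptions to adapt to a real cluster.

```python
from kubernetes import client, config  # pip install kubernetes


def deploy_to_edge(namespace: str = "default") -> None:
    config.load_kube_config()  # kubeconfig pointing at the cloud-side control plane
    apps = client.AppsV1Api()

    deployment = client.V1Deployment(
        metadata=client.V1ObjectMeta(name="edge-analytics"),
        spec=client.V1DeploymentSpec(
            replicas=1,
            selector=client.V1LabelSelector(match_labels={"app": "edge-analytics"}),
            template=client.V1PodTemplateSpec(
                metadata=client.V1ObjectMeta(labels={"app": "edge-analytics"}),
                spec=client.V1PodSpec(
                    # Pin the pod to nodes registered by KubeEdge; the exact label
                    # depends on how the edge node joined the cluster (assumption).
                    node_selector={"node-role.kubernetes.io/edge": ""},
                    containers=[
                        client.V1Container(
                            name="analytics",
                            image="example.com/edge-analytics:latest",  # placeholder image
                            resources=client.V1ResourceRequirements(
                                # keep the footprint small for resource-constrained edge nodes
                                limits={"cpu": "200m", "memory": "128Mi"}
                            ),
                        )
                    ],
                ),
            ),
        ),
    )
    apps.create_namespaced_deployment(namespace=namespace, body=deployment)


if __name__ == "__main__":
    deploy_to_edge()
```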
Please join KubeEdge and Volcano, and let’s work together to build streamlined cloud-edge-IoT solutions that seize the opportunities 5G and AI bring us today and in the years to come.
— Anni Lai