Adoption of Container Orchestration Advancements in Telecom: Load balancing and scaling of 5G traffic

  • 05 Apr 2023

The technological world is constantly advancing, demanding applications that are not only robust but also highly scalable. Telecom 5G applications are a prime example. Traditional approaches with a fixed number of replicas do not suffice: over-provisioning replicas during idle periods incurs unnecessary cost, while keeping only a minimal number of replicas can lead to lost traffic during peaks. With the advent of containerization and orchestration technologies such as Docker and Kubernetes, spawning an application backend container has become quick and easy. In addition, Kubernetes provides built-in autoscaling functionality.

As a specialist in cloud computing working in the field of telecommunications, Pallavi Priya Patharalagadda had the opportunity to participate in projects that explore the potential of containerization technologies in achieving this goal. Her key contributions included addressing a critical bottleneck in a high-traffic scenario and the integration of 4G and 5G core components with Docker and Kubernetes. She proposed Kubernetes' horizontal pod autoscaling (HPA) feature as the solution. By configuring CPU or memory usage thresholds in a Kubernetes YAML configuration file, the engineers could have Kubernetes automatically scale the application based on real-time traffic demands. When load increased, new pods were spun up to handle the surge, ensuring a seamless user experience. The result was a remarkable improvement, with application performance rising from 50% to 85%.
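
To make this concrete, a minimal sketch of what such an HPA definition might look like is shown below. The resource names, replica bounds, and the 70% CPU threshold are illustrative assumptions for a hypothetical 5G backend deployment, not the team's actual configuration.

    # Hypothetical HorizontalPodAutoscaler (autoscaling/v2) for a 5G backend deployment.
    # The names, replica bounds, and CPU threshold below are illustrative assumptions.
    apiVersion: autoscaling/v2
    kind: HorizontalPodAutoscaler
    metadata:
      name: core-backend-hpa
    spec:
      scaleTargetRef:
        apiVersion: apps/v1
        kind: Deployment
        name: core-backend          # backend Deployment handling the 5G traffic
      minReplicas: 2                # small baseline during idle periods
      maxReplicas: 20               # upper bound for peak traffic
      metrics:
        - type: Resource
          resource:
            name: cpu
            target:
              type: Utilization
              averageUtilization: 70   # scale out when average CPU use exceeds 70%

With a definition like this applied, Kubernetes compares observed CPU utilization against the target and adds or removes pods between the configured minimum and maximum, which matches the behaviour described above.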

However, the team's journey didn't end there. While HPA addressed pod-level scaling, a new challenge emerged: packet drops at the service level. With multiple interfaces feeding into a single service, traffic spikes overloaded the service itself, leading to dropped packets despite the autoscaled backend pods. Here, the power of Kubernetes service load balancing came into play. Four individual load balancer services were created, each with its own IP address, all pointing to the same backend pods with CPU-based autoscaling enabled. This distributed the incoming traffic across multiple entry points, eliminating the bottleneck and achieving a significant milestone: zero packet drops.
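
A rough sketch of one such LoadBalancer Service is given below; declaring four similarly shaped Services with distinct names yields four external IP addresses that all front the same autoscaled pods. The Service name, label selector, and port here are hypothetical and only illustrate the pattern described above.

    # Hypothetical Service of type LoadBalancer; three more Services (lb-2 to lb-4)
    # would follow the same pattern with different names, each receiving its own IP.
    apiVersion: v1
    kind: Service
    metadata:
      name: core-backend-lb-1
    spec:
      type: LoadBalancer            # provisions a dedicated external IP for this entry point
      selector:
        app: core-backend           # all four Services select the same autoscaled backend pods
      ports:
        - name: traffic
          protocol: TCP
          port: 8080                # illustrative port; real 5G interfaces would differ
          targetPort: 8080

Because every Service carries its own external IP while the selector stays the same, incoming traffic is spread across several entry points before being balanced over the shared pool of autoscaled pods.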

The solution proved versatile: it not only guaranteed a stable and effective user experience but also paved the way for expanding the customer base. The site's enhanced capacity to accommodate more traffic helped raise revenue by 25%. These results were outlined in Pallavi's research papers, “Resolve Bottlenecks at Kubernetes Using Load Balancer Services” and “Dynamic Scaling of Application Using Kubernetes Horizontal Pod Auto Scaling”, both published in scholarly journals.

Looking ahead, container and orchestration technologies like Kubernetes, OpenShift, and Docker are likely to play an increasingly central role in application development. Integration with package managers like Helm will further streamline the deployment process, while advancements in service mesh technologies will facilitate the communication and management of microservices within a Kubernetes cluster.

Pallavi's experience highlights the transformative potential of Kubernetes in building scalable and resilient applications. Industry experts believe that as the digital landscape continues to evolve, embracing these technologies will be key for businesses to stay ahead of the curve and deliver exceptional user experiences.

This content is produced by Rahul Sharma.