
How Generative AI is Fueling Demand for Kubernetes

Historically, running AI/ML workloads on Kubernetes has not been easy, because such workloads typically demand large amounts of CPU and GPU resources.
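To make that resource hunger concrete, here is a minimal sketch of how a GPU-backed training pod can be requested through the official Kubernetes Python client. The pod name, container image, and resource figures are illustrative assumptions, and the cluster is assumed to run the NVIDIA device plugin so that nvidia.com/gpu is a schedulable resource.

```python
# Minimal sketch: a training pod that asks the scheduler for a GPU.
# Assumes the official `kubernetes` Python client and a cluster where the
# NVIDIA device plugin exposes the `nvidia.com/gpu` resource.
from kubernetes import client, config

config.load_kube_config()  # use the local kubeconfig

training_pod = client.V1Pod(
    metadata=client.V1ObjectMeta(name="llm-training-job"),  # illustrative name
    spec=client.V1PodSpec(
        restart_policy="Never",
        containers=[
            client.V1Container(
                name="trainer",
                image="example.com/llm-trainer:latest",  # placeholder image
                command=["python", "train.py"],
                resources=client.V1ResourceRequirements(
                    # Training is resource-hungry: reserve CPU, memory and a GPU
                    # up front so the scheduler only places the pod on a node
                    # that can satisfy the request.
                    requests={"cpu": "8", "memory": "32Gi", "nvidia.com/gpu": "1"},
                    limits={"cpu": "8", "memory": "32Gi", "nvidia.com/gpu": "1"},
                ),
            )
        ],
    ),
)

client.CoreV1Api().create_namespaced_pod(namespace="default", body=training_pod)
```

Because requests and limits are declared up front, the scheduler will only place the pod on a node that can actually supply the GPU and memory it needs, which is exactly the kind of resource bookkeeping that has made AI/ML on Kubernetes demanding.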

But things have begun to change. The Cloud Native Computing Foundation (CNCF), a non-profit organization that supports the growth and uptake of Kubernetes, recently released a new version, Kubernetes 1.31 (Elli).

Elli also introduces features aimed at conserving resources and improving efficiency, making it easier to cope with the heavy demands of AI and ML workloads running on Kubernetes.

Cloud-native infrastructure, and Kubernetes in particular, has become the choice of most organizations for managing their growing AI workloads. In a recent Pure Storage survey of more than 500 companies, 54% said they are already running AI/ML workloads on Kubernetes.

About 72% said they deploy databases in Kubernetes environments, and 67% run analytics there. These figures are expected to rise as more enterprises shift towards Kubernetes, not least because building AI and ML models is by nature an iterative cycle involving a great deal of trial and error.

“Data scientists frequently adjust and improve models as new training data and model parameters become available. This constant change means that container environments are ideal for supporting such models,” Murli Thirumale, GM (cloud-native business unit), Portworx at Pure Storage, told AIM.

[Image: How generative AI is fueling demand for Kubernetes. Source: @Braincuber]

Kubernetes in the Generative AI Era

June this year was a remarkable month for Kubernetes as it turned ten years old. What began as a container management system within Google, dubbed Borg, has grown into Kubernetes, the container orchestration engine used to manage containerized applications.

“The key idea of a container is to place any given software in an isolated environment for a certain period of time with the possibility of numerous alterations and guaranteeing the constant performance of the software. The only requirement is that the container can be used with any application as long as it exists in a Linux system,” explained Thirumale.

Another factor that makes containers and Kubernetes a good fit for AI/ML models is the variability of data volume and user load. Training usually processes a considerable amount of data, whereas inferencing may process significantly less.

Kubernetes addresses these concerns through its elasticity, allowing resources to be adjusted according to demand. “This is the very nature of Kubernetes, which provides an elastic, self-serve infrastructure, ideally suited for AI and ML applications where demand can vary from time to time,” Thirumale noted.
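As a concrete sketch of that elasticity, the snippet below attaches a Horizontal Pod Autoscaler to an assumed Deployment named inference-server, again via the Kubernetes Python client; the replica counts and CPU target are illustrative assumptions, not recommendations.

```python
# Minimal sketch: autoscale an (assumed) inference Deployment on CPU usage.
from kubernetes import client, config

config.load_kube_config()

hpa = client.V1HorizontalPodAutoscaler(
    metadata=client.V1ObjectMeta(name="inference-server-hpa"),
    spec=client.V1HorizontalPodAutoscalerSpec(
        scale_target_ref=client.V1CrossVersionObjectReference(
            api_version="apps/v1",
            kind="Deployment",
            name="inference-server",  # assumed existing Deployment
        ),
        min_replicas=1,    # shrink when demand is low
        max_replicas=10,   # grow when demand spikes
        target_cpu_utilization_percentage=70,
    ),
)

client.AutoscalingV1Api().create_namespaced_horizontal_pod_autoscaler(
    namespace="default", body=hpa
)
```

With a policy like this, the inference tier can idle at a single replica during quiet periods and fan out when user load spikes, which is the self-serve, demand-driven behaviour Thirumale describes.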

NVIDIA, which recently became the most valuable company in the world, went on to acquire Run.ai, an organisation that develops software for orchestrating AI workloads via Kubernetes.

As AI deployments grow more complex in terms of where workloads run, from clouds to the edge and back into on-premise data centres, managing them effectively grows in importance.

NVIDIA’s acquisition also points towards the increasing adoption of Kubernetes, as effective orchestration becomes essential when AI is distributed in different forms across different types of infrastructure.

Databases Can Run on Kubernetes

Databases, meanwhile, are expected to be very useful as organizations deploy AI solutions across various functions. Experts from the industry have noted that building generative AI agents will rely largely on databases.

Up to this point, only a few organizations have focused on training AI models. Most other enterprises around the globe will be scaling and expanding their AI models in the very near future. Database systems that can scale and offer real-time capabilities are therefore of significant importance.

“Databases are at the heart of every AI/ML process, and today 54% of them are deployed on Kubernetes, a figure that is set to increase in the near future. The majority of these applications are data-based systems, such as customer relationship management (CRM), which hold a lot of data that is accessed often but altered rarely, compared to active systems such as ATMs, where data is constantly updated. AI, ML and analytics are all data-heavy, so the use of Kubernetes for such applications is on the rise,” Thirumale noted.
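As a rough illustration of what running a database on Kubernetes can look like, the sketch below creates a single-replica PostgreSQL StatefulSet with a persistent volume claim template, passing a plain manifest dictionary to the Kubernetes Python client; the image, storage size, and password handling are simplified assumptions rather than a production setup.

```python
# Minimal sketch: a single-replica PostgreSQL StatefulSet with persistent storage.
# Image, storage size and password handling are simplified assumptions.
from kubernetes import client, config

config.load_kube_config()

statefulset = {
    "apiVersion": "apps/v1",
    "kind": "StatefulSet",
    "metadata": {"name": "postgres"},
    "spec": {
        "serviceName": "postgres",            # headless Service assumed to exist
        "replicas": 1,
        "selector": {"matchLabels": {"app": "postgres"}},
        "template": {
            "metadata": {"labels": {"app": "postgres"}},
            "spec": {
                "containers": [{
                    "name": "postgres",
                    "image": "postgres:16",
                    "env": [{"name": "POSTGRES_PASSWORD", "value": "change-me"}],
                    "volumeMounts": [{"name": "data", "mountPath": "/var/lib/postgresql/data"}],
                }]
            },
        },
        # Each replica gets its own PersistentVolumeClaim, so data outlives pod restarts.
        "volumeClaimTemplates": [{
            "metadata": {"name": "data"},
            "spec": {
                "accessModes": ["ReadWriteOnce"],
                "resources": {"requests": {"storage": "10Gi"}},
            },
        }],
    },
}

client.AppsV1Api().create_namespaced_stateful_set(namespace="default", body=statefulset)
```

The volumeClaimTemplates section is the key detail for data-heavy systems such as CRM databases: each replica gets its own persistent volume, so the data survives pod restarts and rescheduling.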

Replacement for VMware

Broadcom’s acquisition of VMware a year ago has also shifted the trend towards Kubernetes, as customers grew anxious over potential pricing changes and how the combined products would work together.

“You can’t do it without it. It’s a package so you have to purchase things you may not want to buy,” said Thirumale.

Citing the same survey, he noted that about 58% of respondents plan to shift some of their VM workloads to Kubernetes. Of those, approximately 65% plan to do so within the next two years.

Kubernetes Talent

As more organizations adopt Kubernetes, the need to hire engineers who can work with the technology will increase too, which, as Thirumale noted, will pose a serious challenge for organizations.

“Kubernetes is not something you get as an installation in college. All the education comes with practice. Candidly, among senior IT folks, there is a positive attitude towards K8s and also towards platform engineering, as going one level above. To put it this way: if you are a VMware administrator or a storage admin, you can pick up K8s and containers,” he explained.

When asked whether educational institutions in India should start training students on Kubernetes, he seemed skeptical. He believes such topics can be included in the syllabus, but there are too many fast-growing fields in the world to cover them all.

“Specialization comes with experience; foundational knowledge is provided at universities. There are also subject-specific trainings and certification courses which can be pursued in addition to a regular college degree,” he concluded.

Organizations looking to enhance their AI processes can benefit from solutions such as Braincuber, which offers automation, management, and development support for the Kubernetes sphere. Braincuber lets users stay in control of the AI engine, providing smart analytics and automating processes so that creative work is enhanced rather than hindered, while Kubernetes takes on the most difficult part of the work: deployment and scaling.

