
Red Hat unveils AI, edge & cloud upgrades in OpenShift 4.17
Red Hat has announced updates to several of its key platforms, highlighting enhancements in artificial intelligence (AI), edge computing, hybrid cloud, security, and developer productivity.
Speaking about OpenShift AI, Jeff DeMoss, Director, AI Product Management, described recent developments to the platform. "At a high level, it's an AI platform that enables enterprises to create AI-enabled applications across hybrid cloud environments. It supports both predictive AI and gen AI use cases and provides a broad set of functionality across the AI life cycle, including areas like model development, training, serving, monitoring, and automation for AI workflows."
DeMoss emphasised the new model registry, explaining, "The model registry provides a central repository to manage versions, metadata, and model artifacts. It contributes to the overall MLOps workflow by enabling teams to collaborate on models and then get them into production more efficiently. It can also help with model governance by providing info on model versions, documentation, where a model came from, what data sets were used, and model evaluation metrics."
He added, "Customers can have multiple registries within their organization, define permissions to control access at either the user or group level, and deploy models directly from the registry. And just one note, we developed this feature within the CubeFlow open source project."
The platform also now supports the vLLM serving runtime for large language models (LLMs). DeMoss noted, "vLLM is a flexible runtime that supports the most popular open-source models on Hugging Face, including Llama, Mixtral, and Mistral. It also has many capabilities and optimizations that improve performance for serving LLMs. In OpenShift AI, we use the most updated version of vLLM, which supports the latest model architectures and multimodal models like vision language models."
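As an illustration of what the upstream vLLM runtime does (independent of how OpenShift AI packages it as a serving runtime), the offline inference API loads a Hugging Face model and generates completions; the model ID and prompt below are examples only.

```python
# Sketch of upstream vLLM offline inference (pip install vllm);
# OpenShift AI wraps vLLM as a managed serving runtime rather than
# calling it directly like this.
from vllm import LLM, SamplingParams

llm = LLM(model="mistralai/Mistral-7B-Instruct-v0.3")  # example HF model ID
params = SamplingParams(temperature=0.7, max_tokens=128)

outputs = llm.generate(
    ["Explain what a model registry is, in two sentences."],
    params,
)
for request_output in outputs:
    # Each request can yield one or more completions; print the first.
    print(request_output.outputs[0].text)
```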
Enhancements also address security and efficiency. "We've also enhanced model serving by adding capabilities for securing model endpoints and adding support for ModelCars, which use OCI images to streamline the process of fetching models for deployment. This improves performance for serving LLMs, especially in environments where you want to autoscale resources based on inference requests," said DeMoss.
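A hedged sketch of what serving a model from an OCI image can look like with KServe, the serving layer OpenShift AI builds on: an InferenceService whose storageUri points at a container image rather than object storage. The image reference, namespace, and model format name below are assumptions, and field details may vary by version.

```python
# Sketch only: creates a KServe InferenceService that pulls its model from
# an OCI image (the "modelcar" pattern). Names, image, and namespace are
# assumptions, not values from the announcement.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() inside a pod

inference_service = {
    "apiVersion": "serving.kserve.io/v1beta1",
    "kind": "InferenceService",
    "metadata": {"name": "llm-demo", "namespace": "demo-project"},
    "spec": {
        "predictor": {
            "model": {
                "modelFormat": {"name": "vLLM"},  # assumed format/runtime name
                # Model weights packaged as an OCI artifact instead of S3/PVC.
                "storageUri": "oci://quay.io/example/granite-modelcar:latest",
            }
        }
    },
}

client.CustomObjectsApi().create_namespaced_custom_object(
    group="serving.kserve.io",
    version="v1beta1",
    namespace="demo-project",
    plural="inferenceservices",
    body=inference_service,
)
```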
There are additional tools for experiment management, batch training, and tuning within model development, alongside features for data drift and bias detection. DeMoss explained, "Data drift detection allows data scientists to detect differences between production data and training data. This feature continuously monitors input data to help maintain model quality over time. And then our out-of-the-box bias detection visualizations help data scientists monitor whether their models are fair and unbiased."
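As a generic stand-in for the idea (not OpenShift AI's actual implementation, which is based on the TrustyAI project), data drift detection can be illustrated with a two-sample statistical test that compares a feature's training distribution against what the model sees in production.

```python
# Generic drift-check illustration using a two-sample Kolmogorov-Smirnov test;
# this is only the underlying idea, not the platform's drift tooling.
import numpy as np
from scipy.stats import ks_2samp


def feature_drifted(train_values, prod_values, alpha=0.05):
    """Return (drifted?, p-value) for one numeric feature."""
    _, p_value = ks_2samp(train_values, prod_values)
    return p_value < alpha, p_value


rng = np.random.default_rng(42)
train = rng.normal(loc=0.0, scale=1.0, size=5_000)  # feature at training time
prod = rng.normal(loc=0.4, scale=1.0, size=5_000)   # same feature, shifted in production

drifted, p = feature_drifted(train, prod)
print(f"drift detected: {drifted} (p={p:.4g})")
```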
Kirsten Newcomer, Senior Director, Hybrid Cloud Platforms, discussed Red Hat's placement as a Leader in Gartner's first Magic Quadrant for cloud application platforms. Newcomer said, "We're super excited to see this. With OpenShift 4.17 we continue our investment in core Kubernetes, and also our investment in security, which remains a particular focus for Red Hat."
She described the company's drive to make technology adoption more manageable. "One of our key focuses as well, beyond core Kube and security, is making it easier for our customers to adopt modern technologies by reducing the cognitive overload that the pace of change in the software industry has created in recent years, and that change has only accelerated with the popularity of generative AI," said Newcomer.
OpenShift 4.17 also sees enhancements for virtualisation and confidential computing. "For security, we continue our investment in confidential computing. We've added a new operator, in tech preview, that provides attestation services for containers, enabling customers to protect data in use," Newcomer stated.
Network security has also advanced. "One of the big tech preview items in 4.17 is our delivery of user-defined networking, also known as native network isolation for namespaces. We continue to deliver OVN-Kubernetes as the default network, but this feature adds VRF support for isolated-by-default user-defined networks," Newcomer indicated.
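To give a sense of what this looks like in practice, here is a hedged sketch of a namespace-scoped user-defined network created through the OVN-Kubernetes UserDefinedNetwork custom resource. The API group, topology, role, and subnet values are assumptions based on the upstream API and may not match the 4.17 tech-preview schema exactly.

```python
# Sketch only: applies an OVN-Kubernetes UserDefinedNetwork custom resource.
# Field names and values are assumptions; consult the tech-preview docs.
from kubernetes import client, config

config.load_kube_config()

udn = {
    "apiVersion": "k8s.ovn.org/v1",
    "kind": "UserDefinedNetwork",
    "metadata": {"name": "tenant-net", "namespace": "tenant-a"},
    "spec": {
        "topology": "Layer2",
        "layer2": {
            "role": "Primary",             # acts as the namespace's primary network
            "subnets": ["10.100.0.0/24"],  # example subnet, isolated per namespace
        },
    },
}

client.CustomObjectsApi().create_namespaced_custom_object(
    group="k8s.ovn.org",
    version="v1",
    namespace="tenant-a",
    plural="userdefinednetworks",
    body=udn,
)
```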
On edge computing, Shobhan Lakkapragada, Senior Director, Product Management, Edge, explained the intent of Red Hat Device Edge. "Our overall design goal when we introduced this was to create an edge computing platform that gives customers two key benefits. One is operational consistency and the second one is flexibility in being able to deploy traditional or modern applications," he said.
Lakkapragada highlighted upgrades in Red Hat Device Edge 4.17, including support for low-latency and AI workloads. "The first one is better support for low-latency workloads and the second one is better support for edge AI workloads... we are also introducing a new build of Red Hat Device Edge that is pre-tuned and preconfigured for industrial control software," he noted.
Regarding AI at the edge, he continued, "We are announcing a tech preview of Red Hat Device Edge with Nvidia Jetson and IGX Orin systems-on-chips... We have partnered with Nvidia to bring the power of the Red Hat stack, as well as Nvidia's AI systems-on-chips, to these edge environments." IPv6 support and image mode for edge devices were also included in this update.
Balaji Sivasubramanian, Senior Director, Developer Tools, reported progress with the Red Hat Developer Hub since its earlier launch. "We have over 20,000 developers signed up, close to 25,000, and tens of big enterprise customers adopting it for the developer productivity use case. Today, on this call, I want to focus on how we are enabling enterprises to adopt AI in their development as well as their business," he stated.
On facilitating AI integration, Sivasubramanian explained, "As part of the Red Hat Developer Hub, we're introducing templates that give developers a quick start. All they have to do is click a few buttons and they're ready to start coding their unique use case for that particular AI application, and that really saves a lot of time."
He added, "Everything can be made available as part of dollar catalog for AI assets and that's a very kind of an accelerating your journey for enterprises not only creating new applications but also be able to like now that I have it how do I reach out to all the developers so they know how to use all of these assets and applications. and we have more to come and in the future releases we plan to add more AI into the product be able to use AI to improve our productivity."
Panelists also discussed opportunities for partners and system integrators. Newcomer said, "It's just a huge opportunity for SIs and consultancies to help customers make that migration, and to help the teams responsible for OpenShift and the teams responsible for the apps running in VMs adopt this environment and become comfortable and familiar with it. It really sets an organization on a path that improves its ability to modernize its applications, because teams can become familiar with the environment even if they're not yet ready to move to microservices or other types of more modern applications."
Lakkapragada added, "there's a lot of if transformation industry specific knowledge right that our SI partners and MSPs even right can bring and as Red Hat right we are very much interested in partnering with SIS and consultants to expand right into this new market right we are relatively new in this space I would say a few years so a lot of the end customers in this space are lines of business decision makers, right? And they're seeing all the transformation that happened in IT world and they want to bring the same thing to operations technology worlds, And I think this is where SIS can play a big role in helping end customers through that transformation."
Sivasubramanian commented, "Developer hub essentially is complete customization for enterprises to build their own internal developer portal that suits their needs and there's a tremendous opportunity for these value added service SI partners to be able to take it and offer to their customer and customize for their use case."
DeMoss concluded, "there are tremendous opportunities for SIS to develop domain or industry specific solutions. So they can leverage their knowledge in a specific domain or industry and instead of just having generic use cases, they can solve more packaged use cases and patterns that they're seeing that are unique to specific verticals."
Additional questions from participants covered topics such as the integration of MLOps into software engineering teams, the progress and adoption of edge computing and WebAssembly, and the use of confidential computing in OpenShift user environments.