AI portfolio brings enhancements to Red Hat OpenShift AI and Red Hat Enterprise Linux AI to operationalize AI strategies

Red Hat today released new updates to Red Hat AI, its portfolio of products and services designed to accelerate the development and deployment of AI solutions across the hybrid cloud. Red Hat AI offers an enterprise AI platform for model training and inference, providing greater efficiency, flexibility, and a simplified experience for deploying systems anywhere across the hybrid cloud.
In the quest to reduce the costs of implementing large language models (LLMs) to meet a growing number of use cases, companies still face the challenge of integrating these systems with their proprietary data and accessing that data from anywhere, whether in a data center, in the public cloud, or at the edge.
By integrating both Red Hat OpenShift AI and Red Hat Enterprise Linux AI (RHEL AI), Red Hat AI addresses these concerns with an enterprise AI platform for training models on a wide range of compute architectures, enabling more efficient, optimized models that are tuned to business-specific data and can be deployed across the hybrid cloud.
“The update enables organizations to be both accurate and cost-effective in their AI journeys,” said Joe Fernandes, vice president and general manager of Red Hat’s AI business unit. “Red Hat knows that enterprises will need ways to manage the rising cost of their generative AI deployments as they bring more use cases into production and operate at scale. Red Hat AI helps organizations address these challenges by enabling them to deploy more efficient, purpose-built models that are trained on their data and enable flexible inference across on-premises, cloud, and edge environments.”
Red Hat OpenShift AI
Red Hat OpenShift AI offers a complete AI platform for managing predictive and generative AI (gen AI) lifecycles across the hybrid cloud, including machine learning operations (MLOps) and large language model operations (LLMOps) capabilities. The platform provides functionality for building predictive models and tuning gen AI models, along with tools to simplify AI model management, from data science pipelines and models to model monitoring, governance, and more.
The latest version of the platform, Red Hat OpenShift AI 2.18, adds new updates and capabilities to support Red Hat AI’s goal of bringing more optimized and efficient AI models to the hybrid cloud. Key features include:
● Distributed serving: Available through the vLLM inference server, distributed serving enables IT teams to split model serving across multiple graphics processing units (GPUs). This helps alleviate the load on any single server, speeds up training and fine-tuning, and promotes more efficient use of compute resources, while also helping distribute serving across nodes for AI models.
● End-to-end model tuning experience: Using InstructLab and Red Hat OpenShift AI data science pipelines, this new capability helps simplify fine-tuning of LLMs, making them more scalable, efficient, and auditable in large production environments, while delivering management through the Red Hat OpenShift AI dashboard.
● AI Guardrails: Red Hat OpenShift AI 2.18 helps improve the accuracy, performance, latency, and transparency of LLMs through a preview of AI Guardrails technology, which monitors and protects user input interactions and model outputs. AI Guardrails provides additional detection capabilities to help IT teams identify and mitigate potentially hateful, abusive, or profane speech, personally identifiable information, competitor data, or other data restricted by corporate policies.
● Model evaluation: Using the language model evaluation (lm-eval) component to provide important insights into the overall quality of the model, model evaluation enables data scientists to compare the performance of their LLMs across a range of tasks, from logical and mathematical reasoning to adversarial natural language, helping to build more effective, responsive, and adaptive AI models.
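The distributed-serving idea above, splitting one model's work across several GPUs, can be sketched in plain Python. This is a conceptual illustration of the shard-and-combine pattern only, not how vLLM is implemented (real serving runs GPU kernels with collective communication, and the matrices here are toy data):

```python
# Conceptual sketch of sharded model serving: a layer's weight matrix is
# partitioned across "devices"; each computes its slice of the output, and
# the partial results are combined.
def matvec(weights, x):
    """Dense matrix-vector product: one row of weights per output element."""
    return [sum(w * xi for w, xi in zip(row, x)) for row in weights]

def sharded_matvec(weights, x, num_shards):
    """Split rows across shards (as if one shard per GPU), then concatenate."""
    shard_size = (len(weights) + num_shards - 1) // num_shards
    partials = []
    for s in range(num_shards):
        shard = weights[s * shard_size:(s + 1) * shard_size]
        partials.extend(matvec(shard, x))  # each shard works independently
    return partials
```

Each shard holds only its slice of the weights, which is why splitting a large model across GPUs both fits models that exceed a single card's memory and spreads the compute load.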
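The input/output screening pattern that guardrails technology applies can be illustrated with a minimal sketch. The detectors and category names below are hypothetical, and this is not the AI Guardrails implementation — just the general shape of screening prompts and responses at a trust boundary:

```python
import re

# Hypothetical detectors, illustrating the guardrail pattern of screening
# both user prompts and model outputs before they cross a trust boundary.
DETECTORS = {
    "pii_email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "pii_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def screen(text):
    """Return the names of all detectors that match the given text."""
    return [name for name, pattern in DETECTORS.items() if pattern.search(text)]

def guarded_generate(prompt, generate):
    """Run `generate` only if the prompt is clean; redact flagged outputs."""
    if screen(prompt):
        return "[input blocked by guardrail]"
    output = generate(prompt)
    if screen(output):
        return "[output redacted by guardrail]"
    return output
```

For example, a prompt containing a social security number would be blocked before the model is ever called, and a model response leaking an email address would be redacted before reaching the user.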
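As a sketch of what such task-level comparisons look like, the snippet below aggregates per-task scores for two models in the spirit of an lm-eval report. The model names, task names, and scores are invented for illustration:

```python
# Hypothetical per-task accuracy scores for two models.
results = {
    "model-a": {"math_reasoning": 0.62, "logical_qa": 0.71, "adversarial_nli": 0.55},
    "model-b": {"math_reasoning": 0.58, "logical_qa": 0.74, "adversarial_nli": 0.61},
}

def best_per_task(results):
    """For each task, pick the model with the highest score."""
    tasks = next(iter(results.values())).keys()
    return {
        task: max(results, key=lambda model: results[model][task])
        for task in tasks
    }

def mean_score(scores):
    """Average a model's scores across all tasks."""
    return sum(scores.values()) / len(scores)
```

A breakdown like this shows where a tuned model actually improves (and where it regresses), which is what makes per-task evaluation more actionable than a single aggregate number.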
RHEL AI
Part of the Red Hat AI portfolio, RHEL AI is a foundation model platform for developing, testing, and running LLMs more consistently to power enterprise applications. RHEL AI offers Granite LLMs and InstructLab model alignment tools, which are packaged into a bootable Red Hat Enterprise Linux image and can be deployed across the hybrid cloud.
Released in February 2025, RHEL AI 1.4 brought several improvements, including:
● Granite 3.1 8B model support as the latest addition to the open-source Granite family of models. The model adds multilingual support for taxonomy/knowledge inference and customization (developer preview), as well as a 128k context window to improve summarization and retrieval-augmented generation (RAG) tasks.
● New graphical user interface for contributing skills and knowledge, available in developer preview form, which aims to simplify data ingestion and chunking, as well as enable users to add their own skills and contributions to AI models.
● Document Knowledge-bench (DK-bench) to facilitate comparisons between AI models fine-tuned on relevant private data and the performance of the same untuned base models.
Red Hat AI InstructLab on IBM Cloud
Increasingly, businesses are looking for AI solutions that prioritize the accuracy and security of their data while keeping costs and complexity as low as possible. Red Hat AI InstructLab, available as a service on IBM Cloud, is designed to simplify, scale, and help improve the security of training and deploying AI systems. By simplifying the tuning of InstructLab models, organizations can build more efficient platforms tailored to their unique needs while maintaining control of their sensitive information.
Free Training on AI Fundamentals
AI is a transformative opportunity that is redefining how businesses operate and compete. To support organizations in this dynamic landscape, Red Hat is offering free online training on AI Fundamentals. The company is offering two AI learning certificates, geared toward both experienced senior leaders and beginners, to help educate users of all levels on how AI can help transform business operations, accelerate decision-making, and drive innovation.
Availability
Red Hat OpenShift AI 2.18 and Red Hat Enterprise Linux AI 1.4 are now available. More information about additional features, improvements, bug fixes, and how to update Red Hat OpenShift AI to the latest version can be found in the product documentation, where details on the latest version of RHEL AI are also available.
Red Hat AI InstructLab on IBM Cloud is coming soon. Red Hat AI Fundamentals training is now available to customers.
About Red Hat
Red Hat is the world’s leading provider of enterprise open source solutions, with a community-driven approach to delivering highly reliable, high-performance technologies such as Linux, hybrid cloud, containers, and Kubernetes. Red Hat helps customers integrate new and existing IT applications, deliver cloud-native development, standardize and automate the industry’s leading operating system, and secure and manage complex environments. Renowned support, training, and consulting services make Red Hat the trusted advisor to Fortune 500 companies. As a strategic partner to cloud and application providers, system integrators, customers, and the open source community, Red Hat helps organizations prepare for the digital future.