Dell releases open source software Omnia, designed for high-performance computing, AI, and data analytics
Dell Technologies has released new open source software, Omnia, designed to manage converged high-performance computing (HPC), AI, and data analytics workloads.
It is also expanding accelerator support for its Dell EMC PowerEdge servers to help organisations tackle data-intensive workloads such as genome sequencing and product development simulations.
Omnia was developed at the Dell Technologies HPC & AI Innovation Lab in collaboration with Intel and with support from the HPC community. The open source software is designed to automate the provisioning and management of HPC, AI, and data analytics workloads, creating a single pool of resources to meet growing and diverse demands.
The Omnia software stack is an open source set of Ansible playbooks that speed the deployment of converged workloads with Kubernetes and Slurm, along with library frameworks, services, and applications. Omnia automatically imprints a software solution onto each server based on the use case, such as HPC simulations, neural networks for AI, or in-memory graphics processing for data analytics, reducing deployment time from weeks to minutes.
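For readers unfamiliar with Ansible-driven provisioning, the sketch below illustrates the general pattern such a stack relies on: a control node runs a playbook against an inventory of cluster nodes, passing variables that select what to install. The file names, inventory layout, and variable names here are illustrative assumptions only, not Omnia's actual repository structure or commands.

```python
# Minimal sketch of Ansible-style cluster provisioning, assuming a hypothetical
# playbook and inventory. These names are illustrative and do not reflect
# Omnia's real file layout.
import subprocess
import sys

INVENTORY = "inventory.ini"         # hypothetical list of head and compute nodes
PLAYBOOK = "provision_cluster.yml"  # hypothetical playbook that sets up Kubernetes or Slurm

def run_playbook(playbook, inventory, extra_vars=None):
    """Invoke ansible-playbook against an inventory, optionally passing extra variables."""
    cmd = ["ansible-playbook", "-i", inventory, playbook]
    if extra_vars:
        # Pass key=value pairs, e.g. which scheduler to deploy for this use case
        cmd += ["--extra-vars", " ".join(f"{k}={v}" for k, v in extra_vars.items())]
    print("Running:", " ".join(cmd))
    return subprocess.call(cmd)

if __name__ == "__main__":
    # Example: provision the cluster for an AI use case backed by Kubernetes
    rc = run_playbook(PLAYBOOK, INVENTORY, {"scheduler": "k8s", "use_case": "ai"})
    sys.exit(rc)
```

The point of this pattern is that the same inventory of bare servers can be re-imprinted for a different use case simply by rerunning the playbook with different variables, which is how deployment time drops from weeks to minutes.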
"We are dedicated to facilitating innovation," says Dell vice president, data center and compute, Chris Kelly.
"Our HPC - AI solutions are enabling renowned organisations across the Asia Pacific and Japan regions, including Kyoto University and AI Singapore, to handle data-intensive tasks including advanced research. Globally, as AI with HPC and data analytics continues to converge rapidly, IT teams are being challenged by siloed storage and networking configurations and providing the required technology resources for shifting demands.
"With the launch of Dells Omnia open source software, teams can dramatically simplify the management of advanced computing workloads, helping them speed up research and innovation," he says.
According to Dell, community involvement and contribution are essential to Omnia's advancement. The Dell Technologies HPC & AI Innovation Lab has worked closely with Arizona State University Research Computing on Omnia's development to better support mixed workloads, including simulation, high-throughput computing, and machine learning.
"Engineers from ASU and Dell Technologies worked together on Omnia's creation," says Arizona State University, senior director of research computing, Douglas Jennewein.
"It's been a rewarding effort working on code that will simplify the deployment and management of these complex mixed workloads at ASU and for the entire advanced computing industry."
Dell says it now offers NVIDIA A30 and A10 Tensor Core GPUs as options for Dell EMC PowerEdge R750, R750xa, and R7525 servers. NVIDIA A30 GPUs support a broad range of AI inference and mainstream enterprise compute workloads, such as conversational AI and computer vision. With NVIDIA A10 GPUs, customers can support mixed AI and graphics workloads on common infrastructure, making them well suited to deep learning inference and computer-aided design.