IT Brief New Zealand - Technology news for CIOs & IT decision-makers

Cloudera brings AI inference & analytics on premises

Tue, 10th Feb 2026

Cloudera has extended its AI inferencing and data access products to run in customer data centres, as more organisations seek to keep sensitive data on premises while rolling out AI systems beyond pilot projects.

The update brings Cloudera AI Inference and Cloudera Data Warehouse with Trino to on-premises deployments. It also adds new AI-focused features to Cloudera Data Visualization for analysis, administration, and audit trails.

Enterprises across Australia and New Zealand face rising scrutiny over data handling, particularly in sectors with strict regulatory obligations. Australia recorded more than 500 data-breach notifications in the first half of 2025, with healthcare and financial services among the hardest hit, according to figures cited in the announcement.

Vini Cardoso, Chief Technology Officer of Cloudera Australia and New Zealand, said the direction reflects governance and risk concerns in the region.

"For ANZ businesses, Cloudera's new capabilities directly address today's toughest pressures: tightening regulation, data sovereignty and escalating cyber risk. In highly regulated sectors like financial services and healthcare especially, they deliver what matters most: control. Sensitive data stays in-house, security is strengthened amid the potential of rising breaches, and compliance obligations are met. Combined with strong governance, auditability and cost predictability, they enable enterprises to scale AI efficiently - providing a secure, resilient foundation for innovation in an environment with mounting risks."

On-prem inference

Cloudera AI Inference is now available on premises. The product handles model serving for AI applications and supports use cases including large language models, fraud detection, computer vision, and voice-related workloads.

The on-premises release targets organisations that want to deploy AI where their data resides, without moving information into external environments. It is also positioned as a way to better manage latency, compliance requirements, and data privacy in production systems.

The on-premises version uses NVIDIA components, including NVIDIA Blackwell GPUs, Dynamo-Triton Inference Server, and NVIDIA NIM microservices. It also supports NVIDIA Nemotron open models.

NVIDIA framed the partnership around deploying inference within data centres and managing the costs of running AI at scale.

"The value of enterprise data is realised when AI can be securely and flexibly deployed where that data lives," said NVIDIA's Pat Lee. "Our collaboration with Cloudera enables customers to deploy and scale AI inference using NVIDIA Blackwell GPUs, Dynamo-Triton and NIM microservices, delivering control, predictable economics, and data-centre efficiency."

Trino integration

Cloudera Data Warehouse with Trino is also moving into data centre environments. Trino is an open-source query engine that enables queries across different data sources. In Cloudera's implementation, it sits alongside governance and monitoring functions that span a company's data estate.

The aim is unified security, governance, and observability across data held in different places, and faster access to analytics when data is spread across platforms.
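As a rough illustration of the federation idea (not taken from the announcement), Trino addresses tables as catalog.schema.table, so a single SQL statement can join data held in different systems. The catalog, schema, and table names below are hypothetical:

```python
import re

# Hypothetical Trino query joining a Hive-backed warehouse table with a
# PostgreSQL table in one statement. Catalog names ("hive_onprem",
# "postgres_crm") and schemas are illustrative, not Cloudera defaults.
FEDERATED_QUERY = """
SELECT c.customer_id, c.region, SUM(o.amount) AS total_spend
FROM hive_onprem.sales.orders AS o
JOIN postgres_crm.public.customers AS c
  ON o.customer_id = c.customer_id
GROUP BY c.customer_id, c.region
"""

def catalogs_referenced(sql: str) -> set[str]:
    """Return the catalog component of each fully qualified
    catalog.schema.table name appearing in the SQL text."""
    return set(re.findall(r"\b(\w+)\.\w+\.\w+\b", sql))
```

Because both tables are named through their catalogs, the engine, rather than an ETL pipeline, resolves where each dataset lives, which is the property that lets governance and monitoring sit in one place above the sources.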

Cloudera cited its own research showing that nearly half of companies store data in a data warehouse, a pattern it says has increased demand for tools that run analytics and AI while reducing the need to move data between environments.

Visualisation changes

Cloudera Data Visualization has received AI-related updates aimed at day-to-day analytics work. One new function generates summaries and contextual notes for charts and visuals. Other changes add resilience features to address transient issues, along with usage analytics for monitoring.

The update also adds AI query logging and traceability. The system records message IDs, timestamps, and questions, which Cloudera said improves transparency and speeds issue resolution.

Administration has changed as well. Administrators can now assign admin roles through revised configuration parameters, which supports single sign-on setups and removes the need for hard-coded credentials and manual user promotion.

Leo Brunnick, Chief Product Officer at Cloudera, said the releases give customers more choice over where AI and analytics run.

"These advancements provide our customers with a superior level of control and flexibility," Brunnick said. "With Cloudera AI Inference, Cloudera Data Warehouse with Trino, and Cloudera Data Visualization all accessible in the data centre, organisations can securely deploy AI and analytics exactly where their most critical data resides. This means enterprises can drive innovation and derive insights without compromising on data security, compliance, or operational efficiency."

The additions come as more enterprises move from experimentation to production deployments and revisit architecture choices across multi-cloud, edge, and on-premises environments.