
Exclusive: How CIOs can overcome the challenges of deep learning

Recently, IT Brief had the opportunity to discuss AI and deep learning, as well as the new NVIDIA and NetApp partnership, with NetApp A/NZ director Dhruv Dhumatkar.

What are some of the major challenges CIOs face in terms of implementing AI and deep learning? 

CIOs are being asked to deliver services that give the business a competitive advantage. That could mean an increase in productivity, new revenue streams, or even new offerings, all delivered at a lower cost.

Organisations are evaluating AI as a means to deliver that advantage. 

From a technology perspective, AI requires massive compute power together with highly scalable/low latency data storage solutions for the complex neural networks that support deep learning and cognitive computing.

Are there any practical ways to overcome these challenges? 

Until recently, the technology to support AI requirements was not available. 

Both NetApp and NVIDIA have invested significant effort into developing powerful platforms that support the sheer performance requirements for AI. 

Technology is no longer a barrier to being able to take advantage of these emerging technologies.

In addition to this, ensuring you have the right skill set within your organisation to help interpret data is really important. It starts with asking the right questions of your data.

What can you tell me about the NetApp and NVIDIA partnership? 

NetApp and NVIDIA have partnered to provide an industry-leading AI platform for enterprise and government organisations.

This partnership has really allowed us to push the boundaries of what we thought was possible. 

Can you give me some more detail about NetApp ONTAP AI? 

NVIDIA DGX systems, combined with NetApp A800 all-flash platforms, provide an ideal environment for high-capacity, high-throughput AI workloads.

What makes this reference architecture different from our competitors' offerings? The simple answer is the AI Data Pipeline.

This provides a unified data management solution for data spanning edge to core to cloud, with NetApp delivering common data services across each tier securely and efficiently.

By leveraging the NetApp Data Fabric, ONTAP AI enables enterprises to create a seamless data pipeline that spans from the edge to the core to the cloud. 

This pipeline integrates diverse, dynamic, and distributed data sources, with complete control and protection. 

With massive processing power and capacity, ONTAP AI removes performance bottlenecks and enables secure, nondisruptive access to data from multiple sources and data formats.
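The edge-to-core-to-cloud pipeline described above can be pictured as three tiers that data moves through: collected at the edge, aggregated at the core for training, and retained in the cloud. The sketch below is purely conceptual, using made-up tier and function names to illustrate the idea; it is not NetApp or NVIDIA code and does not reflect any ONTAP AI API.

```python
# Conceptual sketch of an edge -> core -> cloud data pipeline.
# All names here (Tier, ingest, aggregate, archive) are illustrative
# assumptions, not part of any NetApp or NVIDIA product interface.

from dataclasses import dataclass, field


@dataclass
class Tier:
    """One storage tier in the pipeline, holding a list of records."""
    name: str
    records: list = field(default_factory=list)


def ingest(edge: Tier, samples: list) -> None:
    """Edge tier: collect raw samples as they arrive from devices."""
    edge.records.extend(samples)


def aggregate(edge: Tier, core: Tier) -> None:
    """Core tier: move edge data in for high-throughput processing."""
    core.records.extend(edge.records)
    edge.records.clear()


def archive(core: Tier, cloud: Tier) -> None:
    """Cloud tier: retain processed data for long-term access."""
    cloud.records.extend(core.records)


edge, core, cloud = Tier("edge"), Tier("core"), Tier("cloud")
ingest(edge, ["sensor-1", "sensor-2"])
aggregate(edge, core)
archive(core, cloud)
print(cloud.records)  # ['sensor-1', 'sensor-2']
```

The point of the sketch is simply that the same records flow through every tier unchanged, which is the property a unified data fabric aims to guarantee at enterprise scale.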
