By Mike Sakalas, Regional Vice President, Australia and New Zealand, Pure Storage
We’re at the forefront of the fourth industrial revolution, powered by innovations in artificial intelligence (AI) and machine learning.
These data-driven technologies are enabling everything from smarter healthcare to a greater understanding of crop disease in farming and improved traffic management.
Even the Australian Government has taken notice; in the recent federal budget it announced a $29.9 million boost over the next four years to Australia’s AI and machine learning capabilities.
It’s a significant investment and a necessary one to ensure that Australian companies remain globally competitive. But unlocking the true potential of AI will only be possible if organisations are equipped to handle the explosive growth of data which fuels it.
Despite tracing its origins back to the 1950s, when Alan Turing proposed the Turing Test, it is only recently that AI’s use has expanded beyond academia and the research labs of the world’s largest organisations.
AI has become democratised: technology has advanced to a point where it is now accessible to all.
The current ‘big bang’ of AI adoption is being fuelled by a perfect storm of three key technologies: deep learning (DL), graphics processing units (GPUs) and big data.
Laying the foundations
Inspired by the human brain, DL uses parallel neural networks to effectively write its own software by learning from past examples.
Deep learning technology has already proven to be highly useful in fields where data is less numerical and requires a cognitive approach.
Tasks like speech and audio recognition, language processing and visual understanding likely wouldn’t have progressed as quickly using standard machine learning techniques.
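The “learning from past examples” idea can be sketched in a few lines of Python. This is a toy illustration, not production deep learning: a single sigmoid neuron fits its weights to the truth table of logical OR by gradient descent. Real deep learning stacks many layers of such units, but the principle is the same — no rules are hand-coded; the weights are fitted to the data.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(examples, epochs=5000, lr=0.5, seed=0):
    """Fit weights (w1, w2, bias) to labelled ((x1, x2) -> y) examples
    by gradient descent on the squared error (up to a constant factor)."""
    rng = random.Random(seed)
    w1, w2, b = rng.uniform(-1, 1), rng.uniform(-1, 1), 0.0
    for _ in range(epochs):
        for (x1, x2), y in examples:
            p = sigmoid(w1 * x1 + w2 * x2 + b)
            grad = (p - y) * p * (1 - p)  # error times sigmoid derivative
            w1 -= lr * grad * x1
            w2 -= lr * grad * x2
            b  -= lr * grad
    return w1, w2, b

def predict(weights, x1, x2):
    w1, w2, b = weights
    return sigmoid(w1 * x1 + w2 * x2 + b)

# The "past examples": the truth table for OR.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
weights = train(data)
```

After training, `predict(weights, 0, 0)` falls below 0.5 while the other three inputs score above it — the neuron has learned OR purely from examples.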
GPUs are the second technology behind AI uptake. Modern GPUs, with thousands of cores, are well suited to running the highly parallel algorithms that loosely model the human brain. With the right GPUs, data scientists and academics can run increasingly complex and detailed AI projects.
Both DL and GPUs are major breakthroughs and game-changing technologies, and when applied to the third piece of the puzzle, big data, the potential for innovation is incredible. However, while DL and GPUs are progressing, many storage technologies have lagged behind.
Consequently, there has been a performance gap between the compute element (DL and GPUs) and the storage, limiting the extent to which companies can capitalise on data that has been growing at an exponential rate.
Unlocking the potential of data through infrastructure innovation
As the size of data sets has increased, moving and replicating data has become prohibitively expensive for many organisations, and a potential bottleneck for innovation. A new model is needed, which is where the data-centric architecture comes into play.
Data-centric architecture is a modern design that puts data at the core of an organisation's infrastructure.
This eliminates the need for data to be moved between old and new systems, and keeps business data and applications in one place, allowing technology to be built around it.
The aim is to bring the compute to data, as opposed to the other way around. This means organisations can spend less time and money on moving data, instead focusing on innovation by making use of these data sets quickly.
A data-centric architecture must operate in real-time, to boost the analytics that power AI.
It also needs to be available on-demand, not requiring constant management. Consolidating and simplifying this through flash makes it far easier for teams to support the technology that is fuelling tomorrow’s growth.
Recruiting AI for a competitive advantage
By optimising the compute and storage pairing in this way, organisations can combine the speed of locally attached storage with the simplicity, capacity and consolidation of shared storage, and put that combination to work supporting their AI projects.
Gartner predicts AI will be pervasive in almost every software-driven product and service available by 2020. For this to be a reality, organisations must ensure that data is at the core of their IT strategies.
Without the adoption of a data-centric architecture, organisations may still be able to utilise the compute power that DL and GPUs offer, but to little effect.
Ultimately, truly successful AI adoption depends on this perfect partnership of compute power and storage. Without it, the full potential of data won’t be realised.