
Solving the storage dilemma: Is open source the key?

Business IT is facing storage growth that’s exceeding even the highest estimates, and there’s no sign of it slowing down anytime soon. Unstructured data in the form of audio, video, digital images and sensor data now makes up a large and growing majority of business data, and it presents a new set of challenges that call for a different approach to storage.

For CIOs, the ideal solution is a storage system that offers greater flexibility and choice, together with the ability to identify unstructured data so it can be categorised, utilised and automatically managed throughout its lifecycle.

One answer to the storage problem is software-defined storage (SDS), which separates the physical storage hardware (the data plane) from the data storage management logic or ‘intelligence’ (the control plane). Because SDS needs no proprietary hardware components, IT can run it on off-the-shelf, low-cost commodity hardware, making it a robust, flexible and cost-effective option for enterprises.
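One practical consequence is interface compatibility: several open source SDS stacks expose cloud-style object APIs on commodity hardware (Ceph, which underpins SUSE Enterprise Storage, offers an S3-compatible API through its RADOS Gateway). The sketch below illustrates the idea; the endpoint URL, credentials and bucket name are placeholder assumptions, not a real deployment.

```python
import boto3

# Hypothetical S3-compatible endpoint exposed by an open source
# software-defined storage cluster (for example a Ceph RADOS Gateway)
# running on commodity hardware; the URL and keys are placeholders.
s3 = boto3.client(
    "s3",
    endpoint_url="http://storage.internal.example:7480",
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

# The same calls work unchanged against AWS S3 or another
# S3-compatible provider; only the endpoint and credentials differ,
# which is what keeps an exit plan realistic.
s3.create_bucket(Bucket="sensor-archive")
s3.put_object(
    Bucket="sensor-archive",
    Key="2019/03/line-4.json",
    Body=b'{"sensor": "line-4", "value": 23.7}',
)

for obj in s3.list_objects_v2(Bucket="sensor-archive").get("Contents", []):
    print(obj["Key"], obj["Size"])
```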

A research paper by SUSE entitled Managing the Data Explosion Challenge with Open Source Storage found that rising storage costs consistently rank at the top of business concerns across the industry, but data growth is only one part of a more complex equation. The greatest ongoing cost for IT usually lies in system support and management.

For these reasons, open source storage is an excellent option: it is highly customisable and scalable, backed by a strong community, and built on high-quality code. If you’re thinking of making open source storage part of your strategy, consider the four points below.

1. Maintain multiple storage vendors
While it can be tempting to lock in a vendor’s attractive short- and medium-term pricing by placing all of your business with them, and to gain operational simplicity from a single set of storage tools and processes, you could be playing poker with your storage budget and gambling that your vendor partners will not punish you with price hikes later.

2. Pay attention to the cloud war
Amazon currently has an unquestionable lead in adoption over Microsoft Azure and Google Compute. Nevertheless, everyone knows that Amazon is playing a ‘long game’ of profit tomorrow, not today. Hence, many organisations keep a foot in Azure or Google Compute even while most of their weight is on AWS, because there must be an exit plan. This comes at a price, and that price is operational complexity. It can be particularly high in the world of storage, where the new pricing models can be about how much data you move down the wire rather than how much you own.

3. Maintain and expand your skill sets to avoid lock-in
It’s tempting to reduce complexity by standardising on a small set of suppliers. The upside is simplicity: one approach to storage makes it easy to train staff, fewer specialist skills are needed in a cloud scenario, and arguably you can get on with the ‘core business’ of serving customers that proprietary vendors love to tell you about. The downside is that if you don’t know how to exit AWS and move to Azure without crippling operations, if you don’t know how much it costs to repatriate data (a back-of-the-envelope sketch follows this list), and if you have nowhere to put that data when you do, you are locked in and at the mercy of suppliers.

4. Use open source software-defined storage, or pay more
If you use only cloud or only proprietary software, your software and hardware costs will always be greater than they need to be. This is a simple fact. Open source means cost savings from moving to commodity hardware and the elimination of proprietary software licence fees. Proprietary storage vendors will tell you, rightly, that cost can reappear as skilled headcount, consultancy and support. But if you don’t have skilled headcount, how are you going to maintain your capability to switch cloud providers, and how are you going to assess which vendors to use?
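As a rough illustration of point 3, the sketch below estimates what repatriating data from a cloud provider might cost. The data volume, per-gigabyte rate and overhead factor are illustrative assumptions, not any provider’s actual tariff; real egress pricing is tiered and changes over time.

```python
# Back-of-the-envelope repatriation estimate. The figures below are
# illustrative assumptions, not any cloud provider's actual price list.
DATA_TB = 500            # assumed volume of data to move back on-premises
EGRESS_PER_GB = 0.09     # assumed egress charge, USD per GB
REQUEST_OVERHEAD = 0.02  # assumed extra share for API requests and retries

data_gb = DATA_TB * 1024
egress_cost = data_gb * EGRESS_PER_GB * (1 + REQUEST_OVERHEAD)

print(f"Repatriating {DATA_TB} TB would cost roughly ${egress_cost:,.0f}")
# Under these assumptions: about $47,000, before any on-premises
# capacity to receive the data is even accounted for.
```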

Research your options thoroughly and consider open source software-defined storage: it can give organisations a highly scalable solution that drastically reduces both capital and operational storage costs while making the storage environment more adaptable and simpler to manage.

Article by SUSE APAC chief technologist Peter Lees
