Riding the data wave: How to choose the right technology for the right workload
Mon, 11th Dec 2017

Behind a cacophony of buzzwords in technology is the very real need for CIOs and technologists to choose the right technology mix to support the rapid adoption of data- and workload-heavy technologies.

The need is urgent as Australian companies are embracing new technology and integrating Internet of Things devices throughout every facet of their organisations.

Telsyte has predicted that these organisations will use five to ten times as many connected devices within the next five years. This is generating a swell of data that needs to be housed, transmitted and processed before any actual value can be realised.

Companies are looking for ways to manage the fluctuation and variation in workloads efficiently so that they can start tapping into the insights data promises to deliver.

Scaling out technology platforms is no longer about adding more boxes with the same standardised components; it requires a careful review of computing workloads and custom systems designed to meet their demands.

The evolution of data processing

We are seeing data streams generated by smartphones, autonomous vehicles, heavy-duty machinery in factories and every other area of life pour into data centres.

The challenge now is to design systems that provide maximum processing power without over-provisioned resources going to waste.

While system speed will always be critical to managing intensive workloads, it is equally important to have systems that are designed for the specific workloads being processed.

Different workloads require different underlying resources to process efficiently, and system design needs to take this into account. In addition, hybrid cloud systems need intelligent workload management to ensure ongoing optimisation.

For technology professionals, workload management now sits at the centre of every decision.

Between managing different workloads on different platforms, making sure each workload sits on the most suitable part of a hybrid cloud platform, getting the processing timing right and aligning DevOps activities, managing workloads manually is no longer an option.
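As a purely illustrative example, not drawn from any particular product, the following Python sketch shows the kind of rule-based placement an automated workload manager might apply, routing each workload to on-premises or public cloud capacity based on its resource profile. The workload attributes and thresholds are assumptions made for the example.

```python
from dataclasses import dataclass


@dataclass
class Workload:
    name: str
    cpu_cores: int      # sustained cores required
    data_gb: float      # size of the data set it must read
    sensitive: bool     # subject to data-residency or compliance rules
    bursty: bool        # short-lived spikes rather than steady load


def place(workload: Workload) -> str:
    """Return 'on-premises' or 'public-cloud' for a workload.

    Illustrative policy only: keep sensitive or data-heavy work close to
    the data, and send short bursts to elastic public cloud capacity.
    """
    if workload.sensitive or workload.data_gb > 500:
        return "on-premises"
    if workload.bursty:
        return "public-cloud"
    # Steady, modest workloads default to capacity already paid for.
    return "on-premises"


if __name__ == "__main__":
    jobs = [
        Workload("nightly-analytics", cpu_cores=32, data_gb=1200, sensitive=True, bursty=False),
        Workload("marketing-render", cpu_cores=8, data_gb=20, sensitive=False, bursty=True),
    ]
    for job in jobs:
        print(job.name, "->", place(job))
```

In practice these rules would be evaluated continuously against live telemetry rather than hard-coded, which is precisely why manual placement no longer scales.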

Designing systems to carry the load

To integrate on-premises and cloud environments, technology departments need team members who understand how to manage both types of technology and can make appropriate decisions based on workloads.

This is especially important as businesses move from all public cloud to a mix of cloud and on-premises technology. Recently, Technology Business Research (TBR) found that hyperconverged infrastructure and consumption-based pricing models will shift workloads from public cloud to on-premises environments.

To support this transition, use of partners is critical. Of the companies TBR surveyed as part of its Hybrid Cloud Customer Research, 52 per cent indicated they are working with a system integrator or broker to complete their initial hybrid purchases.

Hybrid delivers the agility and consistent performance enterprises need to move workloads to where they make the most sense for the business.

As data volumes continue to swell, hybrid enables the combination of data streams from different locations, resulting in deeper and more meaningful analysis.

Leading with the edge

Beyond hybrid cloud, we are seeing a shift towards edge computing. Transferring the volume of data being generated presents a challenge for many networks; edge computing addresses this.

As the name suggests, data analysis occurs close to the source of generation instead of first being moved to a centralised cloud.

This means that time is not lost transferring large data sets from the device to the cloud for processing and then transferring the insights back to the machine. Instead, users can act on near real-time analysis of machine performance to drive incremental improvements.

Remote locations such as oil rigs and solar farms can suffer from insufficient network connectivity or bandwidth and high latencies. In these instances, edge computing is the only viable option.

A hybrid cloud and edge solution processes and stores most raw IoT data close to the source, with important or refined data being uploaded during low usage windows to the cloud for long-term storage.

This provides timely responses to events while further analysis is still possible with the refined data.
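As a rough illustration of that pattern, the sketch below reacts to readings locally at the edge, keeps only summarised or anomalous records, and uploads them during an assumed low-usage window. The sensor names, thresholds and stand-in upload step are hypothetical, not from any specific platform.

```python
import statistics
from datetime import datetime, timezone

ALERT_THRESHOLD = 90.0     # assumed temperature limit requiring an immediate local response
QUIET_HOURS = range(1, 5)  # assumed low-usage window (01:00-04:59) for cloud uploads

upload_queue = []  # refined records waiting for the next quiet window


def handle_reading(sensor_id: str, temperature: float) -> None:
    """Act locally on each raw reading; keep only refined data for the cloud."""
    if temperature > ALERT_THRESHOLD:
        # React at the edge without a cloud round-trip.
        print(f"{sensor_id}: shutting down, temperature {temperature:.1f}")
        upload_queue.append({"sensor": sensor_id, "event": "over-temperature",
                             "value": temperature})


def summarise(sensor_id: str, readings: list[float]) -> None:
    """Reduce raw readings to a summary worth keeping long term."""
    upload_queue.append({
        "sensor": sensor_id,
        "mean": statistics.mean(readings),
        "max": max(readings),
        "count": len(readings),
    })


def maybe_upload(now: datetime) -> None:
    """During the quiet window, push refined records to cloud storage."""
    if now.hour in QUIET_HOURS and upload_queue:
        # A real system would call its cloud storage API here; we just report.
        print(f"uploading {len(upload_queue)} refined records")
        upload_queue.clear()


if __name__ == "__main__":
    handle_reading("pump-7", 93.2)
    summarise("pump-7", [71.0, 74.5, 93.2, 70.1])
    maybe_upload(datetime.now(timezone.utc))
```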

Autonomous vehicles are an example of edge computing that is a little closer to home. These cars react immediately to the data their sensors output to avoid collisions; however, the refined data on performance and incidents remains valuable for later analysis.

As we move into 2018, technology managers need to be prepared to oversee the rise of IoT devices and the workload processing this entails.

Meeting this challenge efficiently will require hybrid approaches integrating cloud and on-premises processing to stay ahead of the data deluge.