IT Brief NZ - Is Big Data causing huge downtime in enterprise?


Organisations need to ensure their technology infrastructure is able to handle the load that comes with big data, according to Ixia.

According to the company, the rise of the Internet of Things and the resulting data is providing businesses with enormous opportunities to gain insights like never before, and organisations need to make sure they can handle it.

By 2020, Gartner expects that there will be 25 billion connected ‘things’ in use. The data produced by this rapidly growing collection of connected everyday devices can deliver significant value to business and in everyday life, Ixia says.

Businesses increasingly understand the analytics technology required to make sense of big data. However, Ixia says the network and infrastructure requirements needed to handle big data are often overlooked, along with the additional security, scalability, and visibility they demand.

“There’s no point having the latest data analytics platform if the data can’t reach its destination securely,” says Stephen Urquhart, general manager ANZ, Ixia.

“Investment in big data analysis is at risk of becoming a big loss for businesses if they don’t make sure the rest of their infrastructure is designed to handle the volume of data involved in big data analysis,” he explains.

“Network infrastructure, security, and visibility must be top priorities for businesses that want to get the most out of the large data sets provided by connected devices,” says Urquhart. “One of the best ways to make sure that all the systems within an organisation’s technology footprint can handle big data is to test it to breaking point before going live with a big data analytics programme.”

Urquhart says there are three major considerations to prevent big data from overloading the organisation’s technology footprint.

Infrastructure

A company’s network infrastructure must be able to handle what big data can throw at it, says Urquhart. “With deficiencies in any area of the infrastructure putting time-to-market and return on investment at risk, infrastructure architects must leverage proactive network test strategies to fully evaluate every decision,” he says.

Security

Urquhart says it is no longer sufficient to just choose and deploy products designed to address security needs. “Companies need to prioritise deeper insight into their overall security resilience, which can be achieved through testing,” he says.

Visibility

It is only through successful end-to-end visibility that companies can reap the rewards that big data has to offer, says Urquhart. “IT teams need to get the most out of their monitoring tools by taking full advantage of their core capabilities.”
