IT Brief New Zealand - Technology news for CIOs & IT decision-makers

Why real-time AI remains a challenge

Wed, 1st Aug 2018

Raise the possibility of real-time Artificial Intelligence (AI) and the issue of processing power inevitably follows hot on its heels.

But while extracting actionable insights in the shortest possible time is a business imperative for organisations that hope to remain competitive, computing power is only part of the challenge.

According to the 2016 Gartner Market Guide for Self-Service Data Preparation, analytics users spend the majority of their time either preparing data for analysis or waiting for data to be prepared for them.

This exposes a crucial but often overlooked aspect of the data processing function – data quality. It is imperative that the insights gained from analytics and AI are not just quick, but also accurate and reliable.

Staying ahead

Why does speed matter? Quite simply because markets move fast and databases decay – at the rate of more than 20% a year, according to one report. To get relevant actionable insights, businesses need to keep up by replacing obsolete data with current data as quickly as possible.
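To see why that decay rate bites so quickly, here is a back-of-the-envelope sketch. It assumes the roughly 20% annual decay cited above compounds year on year, which is a simplification for illustration:

```python
# Compound data decay: if ~20% of records go stale each year, the fraction
# of a database that is still current after n years is (1 - 0.20) ** n.
def current_fraction(decay_rate: float, years: int) -> float:
    """Fraction of records still accurate after `years` of compound decay."""
    return (1 - decay_rate) ** years

for n in range(1, 4):
    print(f"After {n} year(s): {current_fraction(0.20, n):.0%} still current")
# After 1 year(s): 80% still current
# After 2 year(s): 64% still current
# After 3 year(s): 51% still current
```

Left untended for three years, roughly half of a contact database can be out of date, which is why continuous refresh matters more than periodic clean-ups.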

With data informing every aspect of operations – it's already been deployed in some form by nine in 10 leaders of large Australian businesses, according to 2018 research from Infosys – ensuring data quality is paramount. Real-time analytics and AI help organisations adjust offers and pricing to current market events, target new and existing customers with tailored options and respond to competitors' offerings.

The ability for employees to access data on demand is key to a successful AI strategy. This is why self-service analytics, cloud-based data access, data preparation and data integration are so important for today's businesses. With more data in the cloud than ever before, companies need to take a new, cloud-first approach to data management. In an ideal world, this involves creating a carefully balanced data environment which ensures data privacy and protection, while offering the right people the access they need, when they need it.

Eliminating poor data

Gartner estimates poor data costs organisations, on average, $15 million every year. The Harvard Business Review attributes this to what it dubs “hidden data factories”, where departments end up having to check and correct data that's already ‘in the system'.

In a world where data has been termed the ‘new oil', it makes sense for businesses to ensure their information is top quality but current processes for doing so are tedious and inefficient. Data deluge, in the form of vast amounts of unstructured, unvetted data, leads to organisations only mining a fraction of what's available to them. And if data scientists do invest significant time refining and preparing data, the insights gleaned may no longer be timely.

Adopting an augmented intelligence approach

When it comes to exploiting structured data, de-duplication of databases is a critical but time-consuming task. AI can be used to slash the time this takes by cleansing duplicated data sets, provided the data is perfect. Unfortunately, in the real world, it rarely is. In many scenarios, augmented intelligence – a combination of AI and human smarts – is likely to be called for.
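As a minimal sketch of that augmented-intelligence pattern, the snippet below uses simple string similarity from Python's standard-library difflib (a stand-in for a production matching engine) to flag likely duplicate records for a human to review, rather than deleting them automatically. The customer records and threshold are illustrative assumptions:

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Case-insensitive similarity ratio in [0, 1]; 1.0 means identical."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def flag_duplicates(records: list[str], threshold: float = 0.85) -> list[tuple[str, str]]:
    """Return pairs of records similar enough to warrant human review."""
    flagged = []
    for i in range(len(records)):
        for j in range(i + 1, len(records)):
            if similarity(records[i], records[j]) >= threshold:
                flagged.append((records[i], records[j]))
    return flagged

customers = [
    "Acme Trading Ltd, Auckland",
    "ACME Trading Limited, Auckland",
    "Beta Logistics, Wellington",
]
print(flag_duplicates(customers))
```

Because real-world data is rarely perfect, the machine only narrows thousands of comparisons down to a handful of candidate pairs; the final merge-or-keep decision stays with a person.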

Here's an example of how it can work. Natural Language Processing (NLP) makes it possible to teach machines to understand natural human language, whether verbal or written, within unstructured data sets. By extracting information, including names and phone numbers, from unstructured data sets such as email threads and notes pages, NLP can contribute to a data quality strategy.
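A regex pass is far cruder than real NLP, but it illustrates the core idea of pulling structured contact details out of free text. The patterns and the sample note below are illustrative assumptions, not a production extractor:

```python
import re

# Simplified patterns for international-style phone numbers and emails.
PHONE_RE = re.compile(r"\+?\d[\d\s-]{7,}\d")
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def extract_contacts(text: str) -> dict:
    """Extract phone numbers and email addresses from unstructured notes."""
    return {
        "phones": PHONE_RE.findall(text),
        "emails": EMAIL_RE.findall(text),
    }

note = ("Spoke to Jane at +64 9 123 4567 about renewal; "
        "follow up via jane.doe@example.com next week.")
print(extract_contacts(note))
```

Genuine NLP goes further, handling names, job titles and context that regular expressions cannot, but the output is the same in kind: structured fields recovered from email threads and notes pages.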

This practice can be employed to good effect in what could be dubbed the ‘lazy sales rep' scenario. The Salesforce system is commonly used to store useful information such as phone numbers and job titles, but until now this data has been largely unexploited. Harnessing AI can change this. By labelling words within an unstructured data set, users can give analytics programs the information that's needed to automate tasks and extend the outcomes of certain labels across larger data sets.
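One way to picture the labelling step is a toy dictionary-based tagger that marks known job-title phrases in a free-text note, so downstream analytics can treat them as structured fields. The title list and sample note are invented for illustration and do not reflect any real Salesforce schema:

```python
# Hypothetical label dictionary; a real system would learn these from
# user-supplied labels rather than a hard-coded list.
KNOWN_TITLES = ["chief information officer", "procurement manager", "cto"]

def label_titles(note: str) -> list[tuple[str, int]]:
    """Return (title, start_index) pairs for each known title found in the note."""
    lowered = note.lower()
    labels = []
    for title in KNOWN_TITLES:
        idx = lowered.find(title)
        if idx != -1:
            labels.append((title, idx))
    return labels

print(label_titles("Met the Procurement Manager; the CTO joins next call."))
```

Once a handful of notes have been labelled this way, those labels can seed a model that applies the same tags across the much larger data set automatically.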

The upshot of this is that previously unexploited data can be put to use within the business. It's an example of humans helping machines to help humans – or augmented intelligence in action.
