
Why real-time AI remains a challenge

Raise the possibility of real-time artificial intelligence (AI) and the issue of processing power inevitably follows hot on its heels.

But while extracting actionable insights in the shortest possible time frame is a business imperative for organisations that hope to remain competitive, computing power is only part of the challenge.

According to the 2016 Gartner Market Guide for Self-Service Data Preparation, analytics users spend the majority of their time either preparing data for analysis or waiting for data to be prepared for them.

This exposes a crucial but often overlooked aspect of the data processing function – data quality. It is imperative that the insights gained from analytics and AI are not just quick, but also accurate and reliable. 

Staying ahead

Why does speed matter? Quite simply because markets move fast and databases decay – at the rate of more than 20% a year, according to one report. To get relevant actionable insights, businesses need to keep up by replacing obsolete data with current data as quickly as possible.

With data informing every aspect of operations – it’s already been deployed in some form by nine in 10 leaders of large Australian businesses, according to 2018 research from Infosys – ensuring data quality is paramount. Real-time analytics and AI help organisations adjust offers and pricing to current market events, target new and existing customers with tailored options and respond to competitors’ offerings. 

The ability for employees to access data on demand is key to a successful AI strategy. This is why self-service analytics, cloud-based data access, data preparation and data integration are so important for today’s businesses. With more data in the cloud than ever before, companies need to take a new, cloud-first approach to data management. In an ideal world, this involves creating a carefully balanced data environment which ensures data privacy and protection, while offering the right people the access they need, when they need it. 

Eliminating poor data

Gartner estimates poor data costs organisations, on average, $15 million every year. The Harvard Business Review attributes this to what it dubs “hidden data factories”, where departments end up having to check and correct data that’s already ‘in the system’.

In a world where data has been termed the ‘new oil’, it makes sense for businesses to ensure their information is top quality, but current processes for doing so are tedious and inefficient. Data deluge, in the form of vast amounts of unstructured, unvetted data, leads to organisations mining only a fraction of what’s available to them. And if data scientists do invest significant time refining and preparing data, the insights gleaned may no longer be timely.

Adopting an augmented intelligence approach

When it comes to exploiting structured data, de-duplication of databases is a critical but time-consuming task. AI can be used to slash the time this takes by cleansing duplicated data sets, provided the data is perfect. Unfortunately, in the real world, it rarely is. In many scenarios, augmented intelligence – a combination of AI and human smarts – is likely to be called for.
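To make the de-duplication idea concrete, here is a minimal sketch of fuzzy record matching using only Python’s standard library. The customer records, the similarity threshold, and the keep-first policy are all illustrative assumptions; a production system would use a dedicated entity-resolution tool rather than this toy approach.

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Rough [0, 1] similarity of two records, ignoring case and padding."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def dedupe(records: list[str], threshold: float = 0.9) -> list[str]:
    """Keep the first record of each group of near-duplicates."""
    kept: list[str] = []
    for record in records:
        # Only keep a record if it is not too similar to anything kept so far.
        if all(similarity(record, k) < threshold for k in kept):
            kept.append(record)
    return kept

# Hypothetical customer rows: the first two differ only by punctuation and case.
customers = [
    "Acme Pty Ltd, 02 9555 0100",
    "ACME Pty. Ltd, 02 9555 0100",
    "Globex Corporation, 03 9555 0199",
]
print(dedupe(customers))  # the near-duplicate Acme row is dropped
```

Even this crude matcher shows why “perfect data” is the exception: the threshold that merges one pair of near-duplicates may wrongly merge another, which is exactly where a human reviewer enters the loop.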

Here’s an example of how it can work. Natural Language Processing (NLP) makes it possible to teach machines to understand natural human language, whether verbal or written, within unstructured data sets. By extracting information, including names and phone numbers, from unstructured data sets such as email threads and notes pages, NLP can contribute to a data quality strategy.
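As a simple illustration of that kind of extraction, the sketch below pulls phone numbers and candidate person names out of an invented sales note using regular expressions. The note text and patterns are assumptions for demonstration only; real NLP pipelines use trained named-entity recognition models, not regexes, and would handle far messier input.

```python
import re

# A hypothetical unstructured CRM note.
NOTE = """
Spoke to Mark Chen from Initech today; best number
for him is (02) 9555 0142. He will loop in their
CFO, Priya Nair, next week.
"""

# Toy patterns: an Australian-style 10-digit phone number, and two
# adjacent capitalised words as a naive stand-in for a person's name.
PHONE = re.compile(r"\(?\d{2}\)?[ ]?\d{4}[ ]?\d{4}")
NAME = re.compile(r"\b([A-Z][a-z]+ [A-Z][a-z]+)\b")

phones = PHONE.findall(NOTE)
names = NAME.findall(NOTE)
print(phones)  # ['(02) 9555 0142']
print(names)   # ['Mark Chen', 'Priya Nair']
```

The name heuristic is deliberately naive, and its false positives on real text are one reason the augmented-intelligence pattern pairs automated extraction with human verification.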

This practice can be employed to good effect in what could be dubbed the ‘lazy sales rep’ scenario. Salesforce systems are commonly used to store useful information such as phone numbers and job titles, but until now this data has been largely unexploited. Harnessing AI can change this. By labelling words within an unstructured data set, users can give analytics programs the information that’s needed to automate tasks and extend the outcomes of those labels across larger data sets.

The upshot of this is that previously unexploited data can be put to use within the business. It’s an example of humans helping machines to help humans – or augmented intelligence in action.

Article by Talend A/NZ Country Manager Steve Singer
