The case for continuous intelligence in APAC cities
Wed, 4th Aug 2021

Six years ago, an ambitious plan was hatched for the northern English city of Leeds: to make it ‘the first city in the world that is fully maintained autonomously by 2035'.

The idea was to eliminate disruptive street maintenance, first in Leeds and then in cities across the UK. Rather than dispatching crews, fleets of robots and drones would roam neighbourhoods, identifying, diagnosing and repairing potholes and performing other civil works.

The concept was briefly raised in Geelong in Victoria last year — possibly inopportunely as the pandemic first struck — but it indicates how smart city thinking is evolving.

In some ways, we already see a lot of precursors to the self-healing city concept, with cities becoming much more intensively data-driven, particularly since early 2020.

Governments, policymakers and planners are dealing with a rapid rate of change and high levels of uncertainty, leading to heightened demand for the ability to model outcomes before taking action.

Technology has also advanced to the point where more ambitious smart city concepts can be realised.

This is evident in the four-dimensional ‘digital twin' models being developed in NSW and Victoria, which promise city planners a way to visualise the effect of changes on a virtual representation of the urban environment without touching the city itself.

We also see it in the rapid rise of people counting and anonymised mobile phone data to track the movement of people and the use of city assets. During the pandemic, this has been used to check compliance with lockdown rules. It also allows councils to allocate finite resources more intelligently to where they are most needed at any point in time.

Streaming data is the bedrock for all of these scenarios — and more

Smart city projects have long contemplated how to treat data streamed in from sensor fleets. However, because of transmission costs or reliability constraints, data was often drawn back to a central point for post-processing at set intervals rather than in real-time. That is now changing: processing occurs on data as it is collected, allowing patterns to be identified in real-time.
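
As a simple illustration of the shift, consider the minimal Python sketch below. It identifies a pattern the moment each reading arrives, maintaining a rolling window rather than waiting to post-process a stored batch; the window size, threshold and readings are illustrative assumptions, not any particular product's API.

    from collections import deque

    WINDOW = 5          # readings per rolling window (illustrative)
    THRESHOLD = 40.0    # alert level (illustrative)

    def watch(stream):
        """Identify patterns in real-time, as readings arrive,
        instead of post-processing a stored batch at set intervals."""
        window = deque(maxlen=WINDOW)
        for value in stream:
            window.append(value)
            if len(window) == WINDOW and sum(window) / WINDOW > THRESHOLD:
                yield f"pattern detected: rolling mean {sum(window) / WINDOW:.1f}"

    # Simulated sensor feed: each value is handled on arrival.
    for alert in watch(iter([35, 38, 41, 44, 47, 50])):
        print(alert)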

Out of that comes a new paradigm: continuous intelligence (CI).

The backstory on CI

Gartner has previously identified six attributes of continuous intelligence.

  • Fast, with everything occurring in real-time
  • Smart, feeding all data into analytics platforms and machine learning
  • Automated, limiting or eliminating human intervention to ensure insights are untainted
  • Always on
  • Integrated into the business, as well as the applications it supports
  • Valuable, delivering measurable results

Servicing these attributes requires a new data processing layer that is ultimately the power behind CI.

The main product categories that make up this new layer perform their high-speed data processing in-memory. For data at rest, the tool is in-memory computing; for data in flight, it is stream processing. If the data needs to be preserved, it is pushed asynchronously to a system of record, where it can be stored at a slower pace.

When in-memory computing and real-time stream processing are combined, they create an extremely powerful platform capable of handling data at rest and in motion. They are the data processing foundation for continuous intelligence.
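
To make the division of labour concrete, the hedged sketch below models it in plain Python: an in-memory dictionary holds the latest state per sensor (data at rest), each event is processed as it arrives (data in flight), and persistence is handed to a background thread so the system of record can store at its own pace. The sensor name, threshold and log file are assumptions for illustration; a production platform would use dedicated in-memory computing and stream processing engines rather than hand-rolled threads.

    import queue
    import threading

    latest = {}                  # in-memory state: data at rest
    persist_q = queue.Queue()    # asynchronous hand-off to the system of record

    def persist_worker():
        """Write events at whatever pace storage allows, off the hot path."""
        with open("system_of_record.log", "a") as log:
            while True:
                sensor, value = persist_q.get()
                log.write(f"{sensor},{value}\n")
                persist_q.task_done()

    threading.Thread(target=persist_worker, daemon=True).start()

    def on_reading(sensor, value):
        """Stream processing: act on each event as it arrives (data in flight)."""
        previous = latest.get(sensor)
        latest[sensor] = value
        if previous is not None and value > previous * 1.5:
            print(f"ALERT: {sensor} jumped from {previous} to {value}")
        persist_q.put((sensor, value))   # durability happens asynchronously

    for reading in [("road-7-vibration", 1.0),
                    ("road-7-vibration", 1.2),
                    ("road-7-vibration", 2.5)]:
        on_reading(*reading)
    persist_q.join()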

Building continuous intelligence platforms is not only about automatically processing data to create intelligent applications. The platforms themselves need to be intelligent: able to scale up and down, recover from failures, tune performance and autonomously identify bottleneck code. Platforms are moving towards self-tuning and self-management, in addition to enabling these characteristics in customers' own environments.

Fast-acting

Continuous intelligence comes into its own when there is time pressure to act. The narrower the window of opportunity to prevent a problem, the more these technologies come into play and add value.

Capturing data in real-time and responding at low latency is particularly valuable for applications such as industrial monitoring.

There are clear applications in a smart city context, where real-time data could be used to mitigate emerging transport bottlenecks before they result in major congestion.
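
Concretely, the rule behind such an intervention might look like the hedged sketch below, where per-minute vehicle counts stream in from a hypothetical intersection sensor and an alert fires while there is still time to retime signals or reroute traffic; the capacity figure and trigger condition are assumptions for illustration.

    def congestion_alerts(counts, capacity=120):
        """Flag an emerging bottleneck from per-minute vehicle counts.

        `capacity` is an assumed vehicles-per-minute ceiling; three
        consecutive minutes above 80% of it triggers the alert."""
        streak = 0
        for minute, count in enumerate(counts):
            streak = streak + 1 if count > 0.8 * capacity else 0
            if streak == 3:
                yield f"minute {minute}: congestion forming, act now"

    for alert in congestion_alerts([90, 100, 99, 101, 110]):
        print(alert)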