ACC’s computer-aided decision-making questioned by Otago experts

28 Sep 2017

Researchers from the University of Otago are warning government departments of the potential pitfalls in using computer-based risk prediction models.

The researchers say the danger of using such methods was recently revealed when a new tool was used by the Accident Compensation Corporation (ACC) to profile and target clients.

The ACC uses a computer model to assist staff managing claims. However, details of what the model does, and how it is used, are somewhat sketchy, says Associate Professor James Maclaurin, spokesperson for the University's Artificial Intelligence and Law in New Zealand Project.

The university highlights information from a press release issued by ACC indicating that the tool is used to make three types of prediction:

  • Which clients are likely to need help and should be called proactively
  • Which type of case owner should assist the client
  • How long a claim should be expected to take to manage

Maclaurin says, “This somewhat vague description leaves open the possibility that ACC uses these predictions to minimise treatment times, either by intervening in patients’ treatment, or (more seriously) by declining applicants with long predicted treatment times.”

Department of Computer Science’s Professor Alistair Knott, another researcher on the project, adds “The tool makes predictions about future ACC cases using a database of information about 364,000 past claims that were lodged between 2007 and 2013.”

“ACC stresses that details about individual cases are kept private both from ACC staff and from other agencies.”

And management still appears to be ultimately under human control - an ACC ‘case owner’ makes the final decision about each case.

“But ACC workers find themselves in a situation increasingly common in our society: their decisions are guided by advice generated automatically by a machine, based on a large set of data extending far beyond their own experience,” argues Knott.

“We are in the same position when we use Google’s navigation system in our cars, or choose a book based on Amazon’s recommendations. In these cases, having a computer in the decision-making loop seems innocuous enough.”

“It seems less innocuous when it guides the agencies whose decisions have serious consequences for people’s lives.”

“Of course it is fundamentally a good thing for people’s decision-making to be informed by statistics. Systems like the ACC tool can be quite accurate, but they don’t reason in the way humans do.”

Knott says it is essential that Governments and companies relying on these systems are able to answer the following questions:

  • How accurate is the tool, in fact? Predictive tools are easy to evaluate: if the public is to have confidence in the ACC tool, and the courts are to evaluate its use, the agency should give a public account of how it was evaluated. Without divulging personal details, this should include a thorough description of the data set on which it was assessed.
  • Can the agency explain the way this tool works so that clients could appeal particular decisions?
  • Does the use of this tool distort the way the agency pursues its stated policy objectives?
  • By ‘passing the buck’ to the machine, is the agency ducking its responsibility to make fair and humane decisions about treatment of New Zealanders in need?
  • Does this tool implicitly discriminate against individuals on problematic grounds such as age, ethnicity or gender? Importantly, simply withholding these attributes from the model does not guarantee fairness, because other features can act as proxies for them, and removing such bias may compromise the accuracy of the tool’s predictions. There is a real risk that the ACC tool unfairly discriminates against some clients, and this possibility needs to be explored in an evaluation of the system.
  • Are employees effectively trained in the use of the system? While the predictive system is probably intended to be used as a guide, to supplement the case owner’s own knowledge and judgements, it is easy to fall into ‘autopilot’ mode when guided by a system - especially if it is fairly accurate. The charge of ‘falling into autopilot mode’ is often levelled at judges using predictive risk tools in US courts.
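The proxy-discrimination concern in the list above can be illustrated with a small simulation. This is a hypothetical sketch, not a description of the ACC tool: the feature names (age group, occupation code) and all the rates are invented for illustration. A model that never sees a protected attribute can still produce systematically different risk scores for a protected group, because a feature it does use is correlated with that attribute.

```python
import random

random.seed(0)

# Synthetic claims. The protected attribute (age group) is never shown to
# the "model", but it correlates with a proxy feature (occupation code).
def make_claim():
    older = random.random() < 0.5                       # protected attribute
    manual = random.random() < (0.7 if older else 0.3)  # proxy feature
    long_claim = random.random() < (0.6 if manual else 0.2)  # outcome
    return older, manual, long_claim

claims = [make_claim() for _ in range(20000)]

# "Model": predicts a long claim purely from the occupation proxy, using
# historical base rates. The protected attribute is not an input.
n_manual = sum(1 for _, m, _ in claims if m)
rate_manual = sum(l for _, m, l in claims if m) / n_manual
rate_other = sum(l for _, m, l in claims if not m) / (len(claims) - n_manual)

def predicted_risk(manual):
    return rate_manual if manual else rate_other

# Average predicted risk per protected group: it differs, even though the
# protected attribute was never used by the model.
older_group = [m for o, m, _ in claims if o]
younger_group = [m for o, m, _ in claims if not o]
risk_older = sum(predicted_risk(m) for m in older_group) / len(older_group)
risk_younger = sum(predicted_risk(m) for m in younger_group) / len(younger_group)

print(f"avg predicted risk, older group:   {risk_older:.2f}")
print(f"avg predicted risk, younger group: {risk_younger:.2f}")
```

Dropping the proxy feature would make the predictions group-neutral, but also less accurate - which is the tension the question above points at: fairness cannot be audited without examining what the model's inputs are correlated with.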

“Predictive analytics technologies show great potential in informing public decision-making, but it is important for these technologies to be evaluated and scrutinised when used in the public domain,” explains Faculty of Law Professor Colin Gavaghan.

“It may be that ACC has addressed the issues we raise in its own internal training and evaluation processes.”

“But we are calling for ACC to provide a public account of how it uses its predictive tool, so as to maintain the integrity of its decision-making.”
