DIA accused of being soft on vendor who delivered faulty software
Mon, 30th Aug 2021

Intelligence systems meant to back up investigations into identity fraud, money laundering and other threats became so degraded at the Department of Internal Affairs that most staff avoided using them.

But attempts to fix the systems came unstuck to the point that data integrity was put at risk.

This is revealed in documents newly released under the Official Information Act (OIA).

One part of the attempt to create an overarching intelligence system between 2013 and 2020 eventually worked, but another part was abandoned - not least because the department did not spell out what it needed.

The documents also show officials managed to claw back $351,000 in a settlement with an unnamed company that failed to deliver the full system.

The case of the $2 million system is another in a long history of IT upgrade woes for government agencies, many of them far more costly, stretching from the $100m failed Incis police computer system to education's massive Novopay headache.

'Poor tools'

An internal DIA review in 2019 says that, in 2013, the department's units had "poor intelligence tools".

These units investigate and regulate identity fraud, anti-money laundering and countering the financing of terrorism, gambling, and community and charity organisations.

"The department's intelligence systems were no longer fit for purpose, and most teams were not using the systems due to integrity and usability concerns," it said.

The DIA lacked any case management tool for investigations, which led to "a limited ability to trust reported data".

Instead, teams were getting by with manual procedures and using spreadsheets.

A start was made on an upgrade, but it was aborted, delayed, and then restarted in 2016.

But two years later, in 2018, the project that was meant to deliver a "robust and efficient investigation process" was in such a state it was judged too high a risk to deploy.

Reviews showed fundamental flaws.

"The vendor did not understand the requirements fully, meaning the solution was never going to meet DIA's needs," one said.

"This should have been picked up during multiple stages."

A review found the department was too soft on the vendor.

"When the vendors go wrong, DIA has tendency to fill in the gaps for them," a review said.

"DIA needs to be able to make decisions to stop paying earlier and push back more when deliverables aren't being met."

Part of the outlay was to buy licences to use the system, before it was even shown to work - and it did not work. DIA got back $116,000 of those licence costs.

Defects popped up again and again, from early on.

"The high frequency of issues being discovered means that testing has been in constant cycles of retest", a report in March 2017 said.

This forced compromises so the system was hard to use - and worse, data integrity "may not be as expected", the report warned.

This carried on into late 2018.

Data integrity was at risk from a glitch that duplicated cases, and from a failure to automatically report errors.

There also were typos in the software code.

"Attachments to entities... were not working.

"Compulsory fields... not validating."

The department pushed the vendor company to provide evidence of testing but "there was a general pushback that issues will be fixed as they arise rather than making sure it is right the first time".

The system "could not be rebuilt from scratch", DIA was told.

Its own oversight had been lacking, and its legal advice was not good enough.

"Risk escalation to the board was too slow and the information provided did not give the board a full enough picture to make effective decisions quickly," a review in February 2019 said.

"There was a lack of ability to remedy issues via the contract as it lacked robustness.

"Better legal advice at contract creation and during the ongoing issues should have been supplied."

DIA had assumed the main vendor had a partnership with another supplier; instead this "was later revealed to be a sub-contract type relationship which started to break down during testing and implementation".

Meanwhile, pressure was piling on to specialist staff whose "time commitments were heavily impacted by the project".

The investigation case management system - which had been so vital as a single place to store case information "in an evidentially sound manner", and to improve reporting and oversight - was written off by the board in early 2019.

The project's budget had been exhausted.

There was one consolation: the intelligence part of the system worked. Two of the four key benefits expected in 2016 were met, one was partly met, and one - automated and standardised processes - was not met.

The new system was meant to safely record, capture, analyse, search for and share intelligence information, the reviews showed.

But managers are back to relying in part on spreadsheets for security and auditing, and staff still have to input a lot of data themselves.