
The "software glitch" that wasn't - Deutsche Bank's excuse doesn't make sense

"It was the computer wot done it, guv," is the latest excuse to come out of a major bank caught with its money laundering knickers around its ankles.

There's just one incy-wincy little point. It wasn't the software - it was the people ... and it's the same problem that makes reliance on so-called AI so dangerous, Nigel Morris-Cotterill says.


According to the Financial Times: "A Deutsche spokesman said one of the bank's several anti-financial crime systems had been affected. The software, which was put in place around 2010, was designed retrospectively to look for suspicious patterns of payments processed by clients of the corporate and investment bank. 'Two of 121 parameters [of the IT application] were defined incorrectly,' said the bank on Wednesday, adding that the fault was discovered by employees of its anti-financial crime unit after it started work to improve its internal processes last autumn."

So, let's break that down, shall we?

First, the bank has multiple systems reviewing the same data, so on the face of it this isn't a big deal. But what if there are gaps between their coverage? A failure in one system may fall into one of those gaps, where no other system picks it up. So it is a big deal.

What is a bigger deal, however, is the fact that this unidentified software has been in place since 2010. It appears to be a sensible, simple system built when common sense reigned: the system reviews transactions against a check-list. In this case there were 121 checks. Two were not properly set up. That means either that the algorithm was programmed to ask the wrong question (i.e. to interrogate the data without correctly specifying which data to use) or that it was programmed to produce an incorrect analysis and/or response.
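Deutsche Bank has not said which two parameters were wrong, or how. But a minimal sketch, assuming a simple threshold-and-jurisdiction rule of the kind such check-lists typically contain (all names, thresholds and country codes below are hypothetical), shows why this sort of misconfiguration fails silently: the rule keeps running, raises no error, and simply stops flagging what it should.

```python
from dataclasses import dataclass

@dataclass
class Transaction:
    amount: float
    country: str

# Hypothetical rule: flag payments at or above a threshold routed
# through listed jurisdictions. The threshold and the country list
# are "parameters" in the sense Deutsche Bank describes.
THRESHOLD = 10_000          # intended value: 10,000
HIGH_RISK = {"XX", "YY"}    # hypothetical jurisdiction codes

def is_suspicious(tx: Transaction) -> bool:
    return tx.amount >= THRESHOLD and tx.country in HIGH_RISK

# If the parameter is defined incorrectly - say the threshold is
# keyed in as 10,000,000 instead of 10,000 - nothing crashes and
# nothing warns. The check simply stops matching almost everything.
BAD_THRESHOLD = 10_000_000

def is_suspicious_misconfigured(tx: Transaction) -> bool:
    return tx.amount >= BAD_THRESHOLD and tx.country in HIGH_RISK

tx = Transaction(amount=50_000, country="XX")
print(is_suspicious(tx))                # True: alert raised
print(is_suspicious_misconfigured(tx))  # False: alert silently lost
```

Nine years of output can look entirely normal while a rule like the second one above quietly flags nothing.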

This is not a "software glitch." It is human error.

Worse, it sits at the heart of where so-called artificial intelligence and machine learning in money laundering risk assessment are going to go wrong.

In programming, there is an ancient expression: GIGO. Created by American programmers, it means "garbage (rubbish) in, garbage out" and it's at the heart of everything computers do.

It follows, then, that to set up such systems, the correct data must be located and analysed, and the responses defined, according to accurate requirements. Accurate. The word is an absolute. It is not qualified. It's not "reasonably accurate." Setting a system up should be ultra-rigorous, undertaken and reviewed by people who really know their stuff.
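One way to make that rigour concrete, sketched here purely as an illustration (the rule names and values are hypothetical, not Deutsche Bank's), is to refuse to run the monitoring at all unless the deployed parameters match the specification that was signed off:

```python
# A simple guard: compare deployed rule parameters against the
# signed-off specification before the monitoring run starts.
# All rule names and values here are hypothetical.

SPEC = {
    "large_payment_threshold": 10_000,
    "high_risk_countries": {"XX", "YY"},
}

def validate_parameters(deployed: dict, spec: dict) -> list[str]:
    """Return a description of every parameter that deviates from spec."""
    errors = []
    for name, expected in spec.items():
        actual = deployed.get(name)
        if actual != expected:
            errors.append(f"{name}: expected {expected!r}, got {actual!r}")
    return errors

deployed = {
    "large_payment_threshold": 10_000_000,  # mis-keyed, as in the sketch above
    "high_risk_countries": {"XX", "YY"},
}

problems = validate_parameters(deployed, SPEC)
if problems:
    raise SystemExit("Refusing to run monitoring:\n" + "\n".join(problems))
```

A check like that is trivial to write, but it only catches anything if the specification itself was drawn up by someone who knew what the right values were, which is precisely the point.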

That is not what happens. And as so-called RegTech grows, the companies are not seeking out those who have the requisite level of skills and experience. Instead, they are relying on those with arbitrary certification from commercial companies. There are no - literally no - official forms of recognition of those with high levels of skill and knowledge.

So, with adequacy rather than excellence as the standard for those setting up the algorithms (and algorithms are what Deutsche Bank is really talking about), blaming the computers or the software isn't being honest. It's like German car manufacturers blaming on-board computers for falsely reporting emissions when the computers did so because software engineers wrote fraudulent software.

While there's good reason to point the finger at the bank, there's an even better reason to point it at those who sell tech as a solution. It isn't. It never has been and it never will be. It's a tool. And, as this case shows, if you mess it up at the design or implementation stage, it stays messed up. And because people trust computers, it stays messed up for a long time, until someone with sufficient authority, or a willingness to be branded a troublemaker, takes a proper, cynical look at it.

Further Reading: https://www.ft.com/content/d53...
"

Author: World Money Lau...