Jeff Judy

Jeff's Thoughts - February 10, 2016

Do You Use Software, or Does Software Use You?

When it comes to gadgets and technology, are you an early adopter?

Some people will wait in line for hours to get the latest smartphone. Some can't resist the latest device or newest software. They grab it and dive right in. For them, learning how it works and using it for real tasks are one and the same process.

That's definitely not me. I've been dragged kicking and screaming into today's technology. For instance, I was using overhead projectors and real transparencies long after most other presenters had abandoned them.

But if I am a late adopter, I like to flatter myself that I am a wise adopter. I move to new technology, not when it becomes available, but when I can see the benefit it brings.

So today, from word processing to e-mails to presentations from my tablet or via webinar, I use up-to-date tools to make me more effective for my clients. I am completely comfortable using a wide range of technology to achieve my goals.

What is truly ironic, however, is that I regularly train staff from credit institutions in how to use their software to make better decisions!

As we all know, computers and software are crucial elements of credit analysis and credit decisions. But at many institutions, software has gotten the upper hand. It is hard to say whether they use the software to gain insights into the borrower's condition, or whether the software uses humans to input data and produce reports!

Oh, I know, the software vendors provide training with their products. But a lot of that training is technical: how to enter data, how to set parameters, how to select reports, plus a cursory summary of what information appears in the output.

That training will get you started using the software. But it may not be enough to teach you to use the software well.

The challenge is to apply "credit thinking" to both the input process and the output. Input is actually a series of decisions about what information to gather and how to categorize it.

But does the same application generate the same input when prepared by different staff at different locations? Do those input patterns reflect credit policy and guidelines, or are they driven by attempts to game the system? If you audited just the inputs across your institution -- something I recommend doing periodically -- would you be pleased with the consistency of the process?

More importantly, do credit staff really take advantage of all the information that is in the reports? Do they get beyond threshold exceptions and approve/decline recommendations to truly understand where the borrower has been and where they are going (with your money)? Does an application that "passes" just get booked, or does someone take the time to see whether the reports suggest additional opportunities with that customer, or perhaps point to clues to better structuring of the credit?

We are all strapped for time, and letting the software do your thinking for you is very tempting. It falls to the approvers and bank leadership to hold credit staff accountable for using software in its proper role: to do the heavy lifting of the number crunching in order to free up time for quality thinking about a credit request.

When every institution uses similar software to crunch numbers, the ones that thrive make more thoughtful use of the numbers going into and coming out of their software. To enjoy a competitive advantage, make sure your credit staff have the knowledge and guidance to use, rather than just serve, your software tools.